Monthly Archives: October 2013

Infosys ran ‘unlawful’ visa scheme, U.S. alleges in settlement

In a settlement announced today, the U.S. government alleges that offshore outsourcing giant Infosys violated visa laws to increase its profits, reduce visa expenses and avoid tax liabilities.

But the allegations, and the evidence the government says it has to back them up, will remain just that — allegations. Instead of pursuing a court case, the U.S. will accept a $34 million settlement payment from Infosys.

Such settlements are how the government typically handles visa violation claims, but in this instance, officials said, it is the largest settlement of its kind ever reached.

In the agreement, which is signed by both parties, Infosys denies any wrongdoing.

The settlement figure represents roughly half a percent of the $6.99 billion in 2012 revenue generated by the Bangalore, India-based IT service provider.

The government action addresses the company’s use of B-1, or business visitor, visas for work that allegedly required H-1B visas.

Generally, a B-1 visa is intended for short-term visits to the U.S., such as trips to attend conferences or negotiate contracts. B-1 visas are relatively easy to get and aren’t subject to any of the caps, fees, or wage requirements that govern use of H-1B visas.

Infosys, which applies for thousands of H-1B visas every year, “unlawfully” supplemented its workforce with B-1 visa workers, according to the U.S. complaint, which alleged that Infosys wrote letters to U.S. officials with “false representations regarding the true purpose” of a B-1 worker’s activities. For example, the company would tell officials that a worker was coming in for discussions, when the real reason was to do work that required an H-1B visa, such as programming. Visa applicants were also told what to say to avoid suspicion, according to the government.

“We will not tolerate actions that mislead the United States and circumvent lawful immigration processes, whether undertaken by a single individual or one of the largest corporations in the world,” said John Bales, U.S. attorney for the Eastern District of Texas, whose office conducted the investigation. “The H-1B and B-1 visa programs are designed and intended to protect the American worker, and we will vigorously enforce the requirements of those programs,” he said in a statement.

In the settlement, Infosys denies any wrongdoing and says its B-1 visa use “was for legitimate business purposes and not in any way [designed] to circumvent the requirements of the H-1B program.”

The company reiterated that position in a statement Wednesday and said it “disputes any claims of systemic visa fraud” and other claims, arguing that such claims are “untrue and are assertions that remain unproven.”

Infosys says that “only 0.02% of the days Infosys employees worked on U.S. projects in 2012 were performed by B-1 visa holders.” But that’s a reference to visa use in the year after the government launched its probe, which started in 2011. Infosys has not disclosed the size of its U.S. workforce, or the percentage of its workforce on visas.

The person who triggered the investigation was Jay Palmer, an Infosys employee and Alabama resident who filed a lawsuit against the outsourcing company that drew attention to its use of the B-1 visa.

Palmer said that after he refused to participate in a plan to use B-1 visa workers for jobs requiring H-1B workers, he was threatened and harassed. Palmer’s case in federal court didn’t make it to trial because of provisions in Alabama’s “at-will” employment law. In dismissing the case, U.S. District Court Judge Myron Thompson said he couldn’t rewrite state laws but nonetheless made it clear that the alleged threats against Palmer were “deeply troubling.” The case didn’t touch on visa law, which is what a Texas grand jury was looking into.

Palmer’s attorney, Kenneth Mendelsohn, said Palmer “was the guy who had the courage to stand up.

“There were many people in Infosys that knew this was going on and just turned the other cheek,” said Mendelsohn. “Jay just morally could not do that, even at the risk of harassment and threats.”

Palmer gave credit to the federal investigators. “Today is not about me, today is about Ed Koranda, and Tim Forte and the U.S. government and their findings and enforcement of the law,” said Palmer, in an interview, referring to the two special agents who investigated the case.

Palmer also said that he “harbors no hard feelings” toward Infosys Executive Chairman Narayana Murthy. “I only wish he would have reached out to me over the last 2.5 years,” he said.

“What people don’t understand is I tried to fix the problem before I got an attorney, before I turned them in,” Palmer said.

The settlement doesn’t affect Infosys’ ability to obtain visas in the future. But it’s unclear whether this agreement is the end of the company’s problems.

The government’s settlement said that Infosys circumvented visa law “for the purposes of increasing profits, minimizing costs of securing visas, increasing flexibility of employee movement, obtaining an unfair advantage over competitors, and avoiding tax liabilities.” These allegations could invite a closer look from the Internal Revenue Service or the U.S. Securities and Exchange Commission.

Infosys, as part of the settlement, also agreed to improve its visa compliance procedures.

Ron Hira, a public policy professor at the Rochester Institute of Technology and a researcher on tech immigration issues, points out that the $34 million settlement represents a “mere 2% of Infosys profits of $1.7 billion last year.”

“Hopefully, policymakers and journalists don’t draw the conclusion that the ‘system works’ because Infosys has settled,” Hira said. “Instead, they should see this for what it is — one small indication of the vast extent to which firms are exploiting loopholes in the visa programs to bring in cheaper foreign workers to displace and undercut American workers.”
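Hira’s figure is easy to check against the numbers reported above; here is a quick arithmetic sketch, using only the dollar amounts stated in the article (in millions):

```python
# Figures as reported in the article (US$ millions).
settlement = 34
profit = 1_700    # Infosys profit "last year," per Hira
revenue = 6_990   # Infosys 2012 revenue

# Settlement as a share of profit -- Hira's "mere 2%".
pct_of_profit = settlement / profit * 100
# Settlement as a share of revenue -- well under 1%.
pct_of_revenue = settlement / revenue * 100

print(f"{pct_of_profit:.1f}% of profit, {pct_of_revenue:.2f}% of revenue")
# → 2.0% of profit, 0.49% of revenue
```

By either measure, the payment is small relative to the company’s scale, which is Hira’s point.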

John Miano, founder of the Programmers Guild, said “it is great to see the government finally doing something.” He contends that the use of the B-1 visa to import labor is widespread.

“Until the government starts seeking criminal actions against individuals, illegal actions like this will continue,” Miano said.

Retrieved from ComputerWorld


At last! The FAA has seen the light on in-flight electronics

At long last, airline passengers fed up with having to switch off their electronic devices during takeoff and landing will be able to use them freely at all times on a flight, not just at cruising altitudes.

The FAA has announced that airlines will soon be receiving guidance on how to implement the new rules, which still require devices to be in airplane mode at all times.

“I will be the biggest proponent of following flight attendant instructions,” said FAA administrator Michael Huerta in a Washington news conference. “But I did feel like any regulation that has been around for a long time — a lot has changed in 50 years. Let’s take another look.”

The change won’t take place overnight; each airline will ease the restrictions on its own timeline. Still, it’s a welcome relief for passengers after a years-long battle to overturn the prohibition.

Surprisingly, once an independent panel (whose members included Amazon) signed off on the idea, it didn’t take long for the FAA to follow suit. A decision was expected “within months,” according to my colleague Lydia DePillis, but the government’s announcement today came just five weeks after the panel finished its safety study.

In a statement, the FAA said customers will “eventually be able to read e-books, play games and watch videos on their devices during all phases of flight, with very limited exceptions.”

Requiring that passengers use airplane mode might make life more difficult for flight attendants, who might not be able to check as quickly whether a customer has the feature turned on. But there’s little point in keeping your cell radios enabled anyway, said Huerta.

“They’re going to ping for a signal, and they won’t get one,” he said. “You’re going to arrive at your destination with a dead phone, and I don’t think anyone wants that.”

Retrieved from WashingtonPost

Top U.S. spies defend surveillance of foreign leaders

U.S. intelligence officials on Tuesday defended surveillance of other countries’ leaders, saying such efforts are common practice across the world’s intelligence agencies.

General Keith Alexander, director of the U.S. National Security Agency, defends his agency’s surveillance programs before a House committee Tuesday.


Surveillance efforts focused on learning the plans of other national leaders have long been part of U.S. and other countries’ spying efforts, James Clapper, U.S. director of national intelligence, said in response to a lawmaker’s question regarding press reports about U.S. spying on the telephone conversations of other countries’ leaders.

Without commenting on surveillance of specific leaders, Clapper said that targeting foreign leaders has been a “basic tenet” of many countries’ intelligence efforts for the last 50 years. Foreign leaders criticizing the U.S. efforts may be ignoring their own intelligence agencies’ efforts, he told the U.S. House of Representatives Intelligence Committee.

“Is this something new and different that the intelligence community might try to target foreign leaders’ intentions, to try to determine what the best policy might be for the United States?” asked Representative Mike Rogers, a Michigan Republican and chairman of the committee.

“It’s one of the first things I learned in [intelligence] school in 1963,” Clapper answered. “It’s a fundamental given in the intelligence business.”

Rogers asked if U.S. allies were targeting the communications of U.S. leaders. “Absolutely,” Clapper said.

Clapper and General Keith Alexander, the U.S. National Security Agency’s director, also disputed press reports saying the agency was collecting the content of telephone calls and Internet communications of tens of millions of Europeans. Those reports were based on a misreading of a screenshot leaked by former NSA contractor Edward Snowden, Alexander said.

In many cases, the numbers reported were based on collection done by other countries’ intelligence agencies and shared with the NSA, Alexander said. The press reports saying the NSA was collecting the information were “completely false,” he said.

Rogers complained about the press reports on the NSA’s spying on other national leaders, saying it “certainly has created an international row” based on “very poor, inaccurate reporting.”

Representative James Langevin, a Rhode Island Democrat, asked Clapper and Alexander about the impact of proposals to end the NSA’s bulk telephone records collection in the U.S. Earlier Tuesday, a group of more than 85 lawmakers introduced a bill that would end the telephone records program.

The end of the program would set back U.S. intelligence to the levels of information available before the Sept. 11, 2001, terrorist attacks, Alexander said. “If we take away that program, what you do is create a gap,” he said. “What you’re asking me is, is there a risk? The answer is yes. We know that risk because that’s where we existed on 9/11.”

It would be wrong to change the program because of a perception of civil liberties violations when the intelligence community hasn’t “articulated the program well enough so people understand how we protect their privacy,” Alexander said.

While Clapper, Alexander and other intelligence officials faced a largely friendly panel of lawmakers Tuesday, Representative Jan Schakowsky, an Illinois Democrat, suggested that Alexander was misreading the criticism of the surveillance programs. While Alexander stressed that NSA employees are “patriots,” no one has suggested that they aren’t, she said.

“People have questioned the policies of the NSA … and they have been carried out by patriots,” she said.

The NSA’s surveillance of foreign leaders was kept from the congressional intelligence committees, Schakowsky said. U.S. diplomatic relations have suffered because of those spying efforts, she added.

“There will be changes” to the NSA programs, she told Clapper and Alexander. “What I heard from you was a robust defense, effectively, of the status quo.”

Retrieved from ComputerWorld

What’s holding back the cloud industry?

While the cloud enthusiasts roaming the halls of Chicago’s McCormick Place at Cloud Connect last week may be high on the market, the reality is that many enterprise IT shops are still reluctant to fully embrace public cloud computing.

Network World asked some of the best and brightest minds in the industry who were at the event about what’s holding the cloud industry back. Here’s what they said:

The organization Eric Hanselman, Chief Analyst, 451 Research Group

Cloud sounds like a great idea, but how will it really work when it’s adopted? Hanselman says one of the biggest barriers is an organizational one. Typically, IT organizations are split into groups focusing on compute, network and storage. When applications run from the cloud, those are all managed by one provider. That means the jobs of each of those groups within IT may change. How can organizations evolve? “You’ve got to converge,” Hanselman says. That could be easier said than done with people’s jobs at stake.

Cloud industry thought leaders (from left): Eric Hanselman, Krishnan Subramanian, Randy Bias, Bernard Golden, Andy Knosp

Security and application integration Krishnan Subramanian, director of OpenShift Strategy at Red Hat; founder of Rishidot Research

Security is still the biggest concern that enterprises point to with the cloud. Is that justified? Cloud providers spend a lot of money and resources to keep their services secure, but Subramanian says it’s almost an instinctual reaction for IT pros to be concerned about cloud security. “Part of that is lack of education,” he says. Vendors could be more forthcoming with the architecture of their cloud platforms and the security around it. But doing so isn’t an easy decision for IaaS providers: Vendors don’t want to give away the trade secrets of how their cloud is run, yet they need to provide enough detail to assuage enterprise concerns.

Once IT shops get beyond the perceived security risks, integrating the cloud with legacy systems is their biggest technical challenge, Subramanian says. It’s still just not worth it for organizations to completely rewrite their applications to run them in the cloud. Companies have on-premises options for managing their IT resources and there just isn’t a compelling enough reason yet to migrate them to the cloud. Perhaps new applications and initiatives will be born in the cloud, but that presents challenges around the connections between the premises and the cloud, and related latency issues.

New apps for a new computing model Randy Bias, CTO of OpenStack company Cloudscaling

If you’re using cloud computing to deliver legacy enterprise applications, you’re doing it wrong, Bias says. Cloud computing is fundamentally a paradigm shift, similar to the progression from mainframes to client-server computing. Organizations shouldn’t run their traditional client-server apps in this cloud world. “Cloud is about net new apps that deliver new business value,” he says. “That’s what Amazon has driven, and that’s the power of the cloud.” Organizations need to be forward-thinking enough, and willing to embrace these new applications fueled by big data and distributed systems, to produce analytics-based decision making and agile computing environments.

It’s more than just technology Bernard Golden, VP of Enterprise Solutions for Enstratius, a Dell company

The biggest inhibitor to more prevalent cloud computing adoption is that organizations are still holding on to their legacy processes, says Golden, who recently authored the Amazon Web Services for Dummies book. It’s not just about being willing to use new big data apps and spin up virtual machines quickly. It’s the new skill sets for employees, technical challenges around integrating an outsourced environment with the current platform, and building a relationship with a new vendor. “For people to go beyond just a small tweak, there needs to be a significant transformation in many areas of the organization,” he says. “Each time there is a platform shift, established mechanisms are forced to evolve.”

Regulatory compliance Andy Knosp, VP of Product for open source private cloud platform Eucalyptus

One of the biggest hurdles for broader adoption of public cloud computing resources continues to be the regulatory and compliance issues that customers need to overcome, Knosp says. Even if providers are accredited to handle sensitive financial, health or other types of information, there is “still enough doubt” among executives in many of these industries about using public cloud resources. Many organizations have therefore started by deploying low-risk, less mission-critical workloads to the public cloud. Knosp says the comfort level for using cloud resources for more mission-critical workloads will grow. It will just take time.

Retrieved from NetworkWorld

Tech spending hurt this year by Congress

Tech spending in the U.S. will increase by a smaller amount this year than earlier predicted, Forrester Research said today. And it is blaming Congress for the lowered forecast.

Instead of rising 5.7% this year, tech spending will increase by just 3.9%, Forrester said.

The federal budget sequester, the government shutdown and the threat of default “has had negative impacts on the economy, has had direct negative impacts on federal tech buying, and has indirect impacts elsewhere on CIOs who simply became cautious,” said Andrew Bartels, a Forrester analyst.

In hardware purchases, for instance, Bartels said that CIOs who might have bought new servers to meet new demand are instead moving peak loads and special projects to infrastructure-as-a-service providers rather than add capacity. “They are not buying servers they might have otherwise bought,” said Bartels in an interview.

In terms of dollars, Forrester expects total U.S. private and public spending on technology to be about $1.243 trillion this year; the 2012 figure was $1.195 trillion.
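Forrester’s dollar totals are consistent with the revised growth forecast; a quick check using only the figures above:

```python
# Forrester's U.S. tech spending totals, as reported above.
spend_2012 = 1.195e12  # 2012 total, in dollars
spend_2013 = 1.243e12  # 2013 estimate, in dollars

# Implied year-over-year growth rate.
growth_pct = (spend_2013 - spend_2012) / spend_2012 * 100
print(f"{growth_pct:.1f}%")  # → 4.0%
```

The implied ~4.0% growth matches the revised 3.9% forecast to within rounding of the trillion-dollar totals.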

Sequestration, and federal spending cutbacks in general, are having a major impact on federal IT buying, the TechAmerica Foundation said earlier this month. Federal IT spending has declined from a peak of $80 billion in 2010 to $70 billion in the fiscal year that began Oct. 1.

The White House, in a report last week that cited private estimates, said the 16-day federal government shutdown reduced the growth rate of GDP in this quarter by between 0.2% and 0.6%.

Next year will be better, said Forrester, which is expecting U.S. business and government purchases of IT goods and services to rise by 5.3%, supported by a revived housing sector, “modest improvement” in employment and consumer spending and improved exports, Bartels wrote in a blog post.

Retrieved from ComputerWorld

FCC lays down spectrum rules for national first-responder network

The U.S. moved one step closer to having a unified public safety network on Monday when the Federal Communications Commission approved the rules for using spectrum set aside for the system.

Also on Monday, the agency directed its Office of Engineering and Technology to start processing applications from vendors to have their equipment certified to operate in that spectrum.

The national network, which will operate in the prized 700MHz band, is intended to replace a patchwork of systems used by about 60,000 public safety agencies around the country. The First Responder Network Authority (FirstNet) would operate the system and deliver services on it to those agencies. The move is intended to enable better coordination among first responders and give them more bandwidth for transmitting video and other rich data types.

The rules approved by the FCC include power limits and other technical parameters for operating in the band. Locking them down should help prevent harmful interference with users in adjacent bands and drive the availability of equipment for FirstNet’s network, the agency said.

A national public safety network was recommended by a task force that reviewed the Sept. 11, 2001, terror attacks on the U.S. The Middle Class Tax Relief and Job Creation Act of 2012 called for auctions of other spectrum to cover the cost of the network, which was estimated last year at US$7 billion.

The public safety network is required to cover 95 percent of the U.S., including all 50 states, the District of Columbia and U.S. territories. It must reach 98 percent of the country’s population.

Retrieved from NetworkWorld

Is that hotspot safe? Wi-Fi Alliance wants to help with its Passpoint program

Security-savvy mobile-device users are increasingly casting a skeptical eye on public Wi-Fi, and now the vendor consortium behind the wireless standard wants to make logging in via that coffee shop network a bit safer.

The Wi-Fi Alliance’s Passpoint program, based on the Hotspot 2.0 specification, will make public hotspots both safer and easier to use, according to CEO Edgar Figueroa.

“Today, for the most part, when we go on a public hotspot we are sending data without protection. With Passpoint the connections are secure and the communication is encrypted,” Figueroa said.

Also, users should no longer have to search for and choose a network, request the connection to the access point each time and then in many cases re-enter their password. All that can be handled by Passpoint-compatible devices, according to Figueroa.

“The beauty of Passpoint is that the whole industry has agreed to do it this way. More than 70 [devices] have been certified,” he said.

Mobile operators have come to see public Wi-Fi as an important part of their networks as they face growing data usage volumes, and between 60 and 70 big operators are members of the Alliance, Figueroa said.

But he had little to say about the uptake of Passpoint among operators, and could point to only two examples: the first Passpoint-compatible hotspot, from Boingo Wireless at Chicago’s O’Hare International Airport, and 30 operators taking part in a trial conducted by the Wireless Broadband Alliance.

Work on an updated Passpoint program has also started. It includes new features for standardized user provisioning. Today everybody signs up users differently, and that makes it harder to implement roaming, according to Figueroa.

The plan is to have the new specification ready next year.

The most obvious problem for current Wi-Fi networks is performance in crowded environments, and Figueroa said the Alliance has addressed that issue with the 802.11ac certification program, which “offers a robust solution that takes you onto 5GHz. So far that band hasn’t been widely used, but ac makes it more compelling,” he said.

The Alliance has also added the new WiGig program, which approves products that operate in the 60 GHz frequency band and offer gigabit speeds over short distances. The technology will be used to connect PCs and portable devices to monitors, projectors and HDTVs wirelessly. Other applications include “instant backups,” according to Figueroa.

“This will be our attempt at making this market go … We have a critical mass of vendors who are investing,” Figueroa said.

Behind the scenes, other areas are also being explored.

“There is a lot of stuff going on around the connected home. There are also a lot of things we are working on for smart grids and there are interest groups looking at health care,” Figueroa said.

Retrieved from ComputerWorld

Six questions to ask your SDN supplier

Let’s face it – real-world deployments of SDN are currently few and far between. While deployments have started to reach beyond the leading-edge Googles and Microsofts of the world, most enterprises and service providers are still in research mode with respect to SDN and the related Open Networking and Network Functions Virtualization (NFV) movements. The good news is that suppliers – even the leading-edge suppliers – are also more SDN researchers and developers than installers and integrators right now.

With all the SDN research and development going on in the industry, there is no shortage of SDN publications and pronouncements. Unfortunately, much of this SDN material is focused on raw technological advancements or unrealized potential benefits. What is missing is the practical guidance for the operator looking to move to SDN sooner (and smoothly) rather than later (and roughly).

How does an operator fill the void between technology and reward? The following questions must be asked – and answered – if an SDN solution is to deliver on its promise:

Question 1 – How do you make it easy for me to integrate SDN solutions into my current network?

Let’s face it – change is the primary source of network problems. Most studies claim that software-related changes cause most network problems. And here we have SDN, not only representing major change to the network, but major software change to the network. You can bet that the fear, uncertainty, and doubt related to SDN rollouts slow adoption of SDN solutions and, ultimately, the delivery of SDN benefits. Two things go a long way toward reducing the FUD factor relating to SDN deployments.

First, validated working solutions provide solid proof that SDN solutions can be woven into current infrastructures without exposing a working network to all the risk typically associated with major network alterations. Second, the continued use of existing network designs and devices within a new SDN structure provides comfort from both a technical and financial standpoint. Upgrading the old provides so much more comfort than replacing with new.

Question 2 – How do you promote the development and delivery of SDN applications?

The more advanced SDN solution suppliers have already established strong programs aimed at attracting application developers. However, attracting developers is one thing. Nurturing them is another. How is this different, you ask? Well, APIs and SDKs send notice that an SDN solution is open to third-party development. These tools serve as the initial attraction to developers. To heighten delivery of SDN applications, people and programs must stand behind the products. Here, engineering resources, certification programs, and integration services drive success over the longer term – for the SDN supplier, application developer, and network operator.

Question 3 – How do you ensure that partners provide maximum impact for my network? What role do partners play in your SDN solution and what type of partners are most important to you?

By its very nature, the SDN environment is multivendor. No one vendor will come close to providing all the key pieces of the SDN puzzle. From devices to controllers to applications, SDN is, first and foremost, an integration challenge for all – suppliers and operators. The more successful suppliers will minimize the integration burden for operators. This requires the build-out of a high-impact ecosystem of partners and, most importantly, the execution of programs that serve to validate third-party product interoperability, facilitate the deployment of mixed vendor solutions, and streamline the delivery of support services aimed at day-to-day operations and future enhancements.

Question 4 – How do you help me get started with SDN?

The enterprise or service provider is increasingly sold on SDN. The benefits are just too compelling to ignore. Now comes the hard part – implementation. How do the operators prepare? Where should they focus their deployment efforts first? What are the first key steps to success? Where will they see immediate impact? What are the risks? The more concrete the guidance, the better the result will be.

With that said, bear in mind that suppliers themselves are learning on the fly with SDN, too. Here, the supplier that is willing to commit in-line resources — not just off-line verbal or written guidance — to your preparation, installation, and initial operation is solidifying not only the operator’s specific deployment, but their own SDN technology and techniques.

Question 5 – How do you assure that your SDN solution promotes open networking?

SDN will never deliver on its full promise unless it is enabled by open networking technology. As earlier questions and answers indicate, systems integration, application development, ecosystem management, and rewarding deployments all pose a challenge to SDN adoption. Core to solving all of these challenges is effective use of open networking technology wherever possible. Does this mean that proprietary software and hardware systems are dead in SDN environments? Not at all. Standardized systems will never keep pace with all operator demands. Here, the proprietary system will serve us all well. The key for operators is to leverage all the open networking movement has to offer and to choose proprietary solutions that drive maximum value and accommodate ready replacement or adaptation as standards evolve.

… and a final bonus question:

We’re early on in SDN adoption. Even the early users of SDN have applied the technology in rather specialized ways – e.g. data center traffic management. We’re learning new SDN technologies, techniques, and troubles every day, it seems. With the vast majority of operators very unsure about what to do with SDN, the above questions must be asked and answered before SDN moves into the mainstream. With all the potential benefits of SDN, widespread adoption is inevitable. Solid answers to the above questions make SDN happen sooner rather than later!

Of course, with all that said, every operator should also be asking themselves, “How could SDN help my users, my staff, my organization… now?”

Retrieved from NetworkWorld

British man charged with hacking U.S. military networks

A British man has been arrested in England and charged by the United States and Britain with hacking into U.S. government computer systems, including those run by the military, to steal confidential data and disrupt operations, authorities said.

Lauri Love and three co-conspirators allegedly infiltrated thousands of systems including those of the Pentagon’s Missile Defense Agency, the U.S. Army Corps of Engineers, the U.S. space agency NASA and the U.S. Environmental Protection Agency, according to a U.S. grand jury indictment made public on Monday.

Love, 28, and the unnamed co-conspirators, including two in Australia and one in Sweden, then left “back doors” in the networks to later retrieve data, and intended that their activity “disrupt the operations and infrastructure of the United States government,” according to the indictment.

“Such conduct endangers the security of our country and is an affront to those who serve,” U.S. Attorney Paul Fishman in New Jersey, who announced the charges, said in a statement.

Love was charged in Britain with violating the Computer Misuse Act, and charged in the United States with accessing a U.S. government computer without permission and conspiracy, authorities said.

Fishman said the hacking took place from October 2012 until this month. He said it compromised personal data of U.S. military personnel, and information on defense budgets, contract bidding, and the demolition and disposal of military facilities, and caused millions of dollars of losses.

The arrest comes as authorities worldwide coordinate efforts to combat cybercrime. On October 10, U.S. Defense Secretary Chuck Hagel issued a memorandum emphasizing a need to safeguard even unclassified technical data against cyber intrusions to help protect U.S. military superiority.

“The cyber threat presents a significant risk to national security and military operations,” Pentagon spokesman Lieutenant Colonel Damien Pickart said. “We take this threat seriously and work diligently to prevent future intrusions.”


Love lives in the Suffolk village of Stradishall, about 70 miles northeast of London. He was arrested at his home on October 25 by the cybercrime unit of Britain’s National Crime Agency and other officials, authorities said. He has been released on bail until February 2014, an NCA spokeswoman said.

U.S. prosecutors said the scheme by Love and his co-conspirators involved the installation of malware in the hacked systems, creating “shells” and “back doors” that allowed them to return later to steal data.

The indictment described how Love, who was also known as “nsh” and “route” and “peace,” at times allegedly used internet chat rooms to discuss the hacking and efforts to conceal it.

In an October 2012 conversation described in the indictment, Love discussed the hacking of an Army Corps database that might have yielded 400,000 email addresses, and asked a co-conspirator to “grab one email for curiosity.”

Nine months later, in July 2013, he appeared to boast about accessing a NASA database, telling another co-conspirator “ahaha, we owning lots of nasa sites,” the indictment said.

Later that month, he told the same co-conspirator after another hacking: “This … stuff is really sensitive. … It’s basically every piece of information you’d need to do full identity theft on any employee or contractor for the (agency),” according to the indictment.

Prosecutors said hacked systems were located in places including Vicksburg, Mississippi, and the U.S. Army’s Aberdeen Proving Ground in Maryland, and also included a server containing information about military personnel at Fort Monmouth in New Jersey.

Love faces up to five years in prison and a fine on each U.S. criminal count. Prosecutors said he faces additional charges in federal court in Alexandria, Virginia, stemming from other unspecified “intrusions.”

Retrieved from Reuters

NSA website goes down after ‘error during scheduled update’

The USA’s National Security Agency (NSA), lately the source of near-endless controversy for spying on just about the entire internet, has itself hit trouble online after its website went down.

The agency has ‘fessed up to some website wobbles last Friday, but has issued a statement to all and sundry that says “an internal error that occurred during a scheduled update” was the source of the outage. The statement went on to say “The issue will be resolved this evening. Claims that the outage was caused by a distributed denial of service attack are not true.”

That last claim looks correct: online agitators aren’t claiming to have taken down the agency’s site, and are instead making light of the situation with tweets such as the one below.

Anonymous @AnonyOps – Aww, don’t panic about [the NSA’s site] being down. They have a backup copy of the internet.

Retrieved from TheRegister