Category Archives: Information Technology

IBM’s government cloud gets FedRAMP approval

IBM’s SmartCloud for Government now has the U.S. government-sanctioned FedRAMP approval that should make it easier for IBM to sell cloud technologies into multiple government agencies.

All cloud providers wanting to provide IT infrastructure under government contracts must be certified under the Federal Risk and Authorization Management Program (FedRAMP) by June 2014. Amazon Web Services, which is increasingly seen as a big rival to IBM, especially after beating IBM to win the CIA cloud contract, got its approval in May. Microsoft Windows Azure got its accreditation in September.

The SmartCloud for Government does not yet incorporate technology from SoftLayer, the cloud provider IBM bought for about $2 billion in June. An IBM spokesman said the company is preparing a SoftLayer government cloud that should be available early next year and will “achieve all the necessary security requirements as well.” IBM recently notified users of its SmartCloud Enterprise product that they will be transitioned to SoftLayer, according to a VentureBeat report.

Other accredited providers are Akamai, AT&T, Autonomic Resources, CGI Federal, Hewlett-Packard and Lockheed Martin.

The stakes are huge. The U.S. government’s Cloud First initiative is pushing agencies to deploy more IT on cloud as a cost-saving and efficiency-boosting measure. That means tens of billions in spending over the next few years. After next year’s deadline, vendors have to be approved to compete for those jobs.

Retrieved from GIGAOM

5 disturbing BYOD lessons from the FAA’s in-flight electronics announcement

Last week, the Federal Aviation Administration (FAA) moved to let passengers use their smartphones and tablets during airline take-offs and landings. At first glance, this seems like a victory for reasonableness, productivity, and looking out for the rights of technology end users. Even The New York Times said “the agency won unusually broad praise from pilots, flight attendants and members of Congress, along with passengers.”

But a closer look reveals that the FAA has in fact unwittingly written a guide of what NOT to do when creating a Bring Your Own Device (BYOD) policy – which is essentially what this is. The FAA’s policy may be a step in the right direction for fliers, but it remains plagued with vague instructions, unsupported reasoning, and painfully convoluted processes. Smart IT departments can learn some useful lessons as they wrestle with managing how their users are supposed to work with various devices and access corporate networks.

1. It’s confusing. Any techie worth his pocket protector knows that workable policies have to be clear to everyone. How can users do what they’re supposed to do if they can’t even figure out what they’re supposed to do? And the FAA’s policy has so many caveats, exceptions, and implementation variables that even flight attendants don’t have a clue about what’s acceptable – much less passengers.

2. It’s unenforceable. Say what you will about the old rules, at least you could tell from the aisle whether someone was using a banned device. But you can’t tell if a device is in airplane mode without making the user hand it over – which isn’t supposed to be part of the new rules. A good thing, too: Can you imagine a corporate BYOD policy that let IT folks demand execs hand over their smartphone like policemen asking for your papers?

3. Its logical underpinnings are suspect. Banning the use of portable electronics during take off and landing was based on the possibility that they could cause interference with airplane navigation systems. It turned out there was little evidence of that. But the FAA still says cellphone calls could disrupt radio communications, and the new approach actually makes it more likely for that to happen. If these things really are dangerous – and all the exceptions seem to indicate that they could be in certain cases – then why are we so eager to let people do this?

4. It throws ultimate responsibility onto the users. Instead of setting a single policy, the FAA is now requiring every airline to get an individual safety certification for each type of airplane it flies. Planes that may be more vulnerable to radio interference may have different rules. Oh, and if there’s bad weather and low visibility (estimated to be about 1% of the time), the airlines may be required to make passengers shut down their devices.

5. It undermines respect for other rules. This whole thing is a complete debacle, from the patently ridiculous old rules to the confusing, illogical, and unenforceable new ones that airlines are required to interpret on the fly. It all adds up to making the FAA and the airlines look stupid and out of touch, and erodes passengers’ willingness to follow other – presumably more important – regulations. And that could have truly disastrous consequences.


Retrieved from NetworkWorld

‘War Room’ notes describe IT chaos at HealthCare.gov

WASHINGTON — On the morning of Oct. 1 in Washington, temperatures in the low 80s were expected, the Republican-engineered federal shutdown was in its first day, and a “War Room” team gathered for a meeting. They kept notes.

Many federal offices were empty that day due to the shutdown-caused furloughs of federal employees. But Oct. 1 was also the day of the launch of HealthCare.gov, the Affordable Care Act’s website and the main portal to sign up for insurance under the new law. Trouble tickets quickly piled up, and wait times for help desk responses grew to as much as five hours.

At some points in the days immediately following the launch, there were 40,000 people in virtual “waiting rooms” because capacity had been reached. Some were waiting 15 to 20 minutes in these rooms.
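The “virtual waiting rooms” described above are a standard admission-control pattern: once the number of active sessions hits a capacity ceiling, new arrivals are queued and admitted first-in, first-out as sessions free up. Here is a minimal sketch of the idea in Python; the class and names are illustrative, not HealthCare.gov’s actual implementation:

```python
from collections import deque

class WaitingRoom:
    """Admit users up to a capacity limit; queue the rest FIFO."""

    def __init__(self, capacity):
        self.capacity = capacity   # max concurrent active sessions
        self.active = set()        # users currently on the site
        self.queue = deque()       # users waiting to get in, oldest first

    def arrive(self, user_id):
        """Admit the user if there is room; otherwise place them in the queue."""
        if len(self.active) < self.capacity:
            self.active.add(user_id)
            return "admitted"
        self.queue.append(user_id)
        return "waiting"

    def leave(self, user_id):
        """Free a slot and admit the longest-waiting queued user, if any."""
        self.active.discard(user_id)
        if self.queue and len(self.active) < self.capacity:
            self.active.add(self.queue.popleft())

room = WaitingRoom(capacity=2)
print(room.arrive("alice"))    # admitted
print(room.arrive("bob"))      # admitted
print(room.arrive("carol"))    # waiting
room.leave("alice")
print("carol" in room.active)  # True
```

The pattern trades latency for stability: users wait minutes at the door rather than overloading the servers behind it.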

The War Room notes, 175 pages in all, were released Monday in PDF form by U.S. Rep. Darrell Issa (R-Calif.), who chairs the House Oversight and Government Reform Committee. Issa, a critic of the health care law, is using the notes to draw attention to the limited number of insurance sign-ups so far. Just six people signed up on the first day.

The War Room notes also catalog IT problems — dashboards weren’t showing data, servers didn’t have the right production data, third party systems weren’t connecting to verify data, a key contractor had trouble logging on, and there wasn’t enough server capacity to handle the traffic, or enough people on the help desks to answer calls. To top it off, some personnel needed for the effort were furloughed because of the shutdown.

One note posed this question: “Given the computer system issues, should we be saying that paper is better for now?” That course was never taken.

The War Room notes don’t reflect any of the frustration, worry or anger that might have been present. They simply lay out the plan’s action items, issues and challenges concisely.

There’s no reaction, for instance, to President Obama’s criticism on Oct. 21 when he said that the website “has been too slow, people have been getting stuck during the application process.”

At the outset, federal officials were searching for a root cause of the capacity issues, and seemed to fix much of the blame on the enterprise identity management system, which they described as a choke point. There are also multiple references to increasing server capacity.

The team did decide to add “consumer-friendly” messages for customers caught up in the online crush.

As the extent of the availability and performance problems became ever clearer, managers were lining up volunteers for weekend work. Many federal employees, even though they were deemed essential, weren’t being paid during the shutdown. But there were bureaucratic complications. For instance, according to one note: “Donna’s comp time approver is furloughed.”

Another note also addressed the furlough: “Casework team was furloughed yesterday, but called back in today.”

In the initial rollout, a needed service from the Dept. of Veterans Affairs was down, and another system, used for income verification, was “creating confusion with credit check information.” Help desks assigned to various issues were quickly expanded.

By week two of the rollout, “about 60% of applicants are getting into HealthCare.gov without sitting in the waiting room, up from 5-10% earlier last week,” the notes said. Capacity improvements were working. “Additional servers are making it easier to get in,” according to the notes. Federal officials say they will have all problems resolved by the end of this month.

Retrieved from ComputerWorld

Infosys ran ‘unlawful’ visa scheme, U.S. alleges in settlement

In a settlement announced today, the U.S. government alleges that offshore outsourcing giant Infosys violated visa laws to increase its profits, reduce visa expenses and avoid tax liabilities.

But the allegations, and the evidence the government says it has to back them up, will remain just that — allegations. Instead of pursuing a court case, the U.S. will accept a $34 million settlement payment from Infosys.

That’s how the government typically handles visa violation claims, but in this instance, the government said, it is the largest such settlement ever reached.

Infosys denies any wrongdoing in the agreement, which is signed by both parties.

This settlement figure represents a fraction of the $6.99 billion in 2012 revenue generated by the Bangalore, India-based IT service provider.

The government action addresses the company’s use of the B-1, or business visitor visas, for work that allegedly required H-1B visas.

Generally, a B-1 visa is intended for short-term visits to the U.S., such as trips to attend conferences or negotiate contracts. B-1 visas are relatively easy to get and aren’t subject to any of the caps, fees, or wage requirements that govern use of H-1B visas.

Infosys, which applies for thousands of H-1B visas every year, “unlawfully” supplemented its workforce with B-1 visa workers, according to the U.S. complaint, which alleged that Infosys wrote letters to U.S. officials with “false representations regarding the true purpose” of a B-1 worker’s activities. For example, the company would tell officials that a worker was coming in for discussions, when the real reason was to do work that required an H-1B visa, such as programming. Visa applicants were also told what to say to avoid suspicion, according to the government.

“We will not tolerate actions that mislead the United States and circumvent lawful immigration processes, whether undertaken by a single individual or one of the largest corporations in the world,” said John Bales, U.S. attorney for the Eastern District of Texas, whose office conducted the investigation. “The H-1B and B-1 visa programs are designed and intended to protect the American worker, and we will vigorously enforce the requirements of those programs,” he said in a statement.

In the settlement, Infosys denies any wrongdoing and says its B-1 visa use “was for legitimate business purposes and not in any way [designed] to circumvent the requirements of the H-1B program.”

The company reiterated that position in a statement Wednesday and said it “disputes any claims of systemic visa fraud” and other claims, arguing that such claims are “untrue and are assertions that remain unproven.”

Infosys says that “only 0.02% of the days Infosys employees worked on U.S. projects in 2012 were performed by B-1 visa holders.” But that’s a reference to visa use in the year after the government launched its probe, which started in 2011.  Infosys has not disclosed the size of its U.S. workforce, or the percentage of its workforce on visas.

The person who triggered the investigation was Jay Palmer, an Infosys employee and Alabama resident who filed a lawsuit against the outsourcing company that drew attention to its use of the B-1 visa.

Palmer said that after he refused to participate in a plan to use B-1 visa workers for jobs requiring H-1B workers, he was threatened and harassed. Palmer’s case in federal court didn’t make it to trial because of provisions in Alabama’s “at-will” employment law. In dismissing the case, U.S. District Court Judge Myron Thompson  said he couldn’t rewrite state laws but nonetheless made it clear that the alleged threats against Palmer were “deeply troubling.” The case didn’t touch on visa law, which is what a Texas grand jury was looking into.

Palmer’s attorney, Kenneth Mendelsohn, said Palmer “was the guy who had the courage to stand up.

“There were many people in Infosys that knew this was going on and just turned the other cheek,” said Mendelsohn. “Jay just morally could not do that, even at the risk of harassment and threats.”

Palmer gave credit to the federal investigators. “Today is not about me, today is about Ed Koranda, and Tim Forte and the U.S. government and their findings and enforcement of the law,” said Palmer, in an interview, referring to the two special agents who investigated the case.

Palmer also said that he “harbors no hard feelings” toward Infosys Executive Chairman Narayana Murthy.  “I only wish he would have reached out to me over the last 2.5 years,” he said.

“What people don’t understand is I tried to fix the problem before I got an attorney, before I turned them in,” Palmer said.

The settlement doesn’t affect Infosys’ ability to obtain visas in the future. But it’s unclear whether this agreement is the end of the company’s problems.

The government’s settlement said that Infosys circumvented visa law “for the purposes of increasing profits, minimizing costs of securing visas, increasing flexibility of employee movement, obtaining an unfair advantage over competitors, and avoiding tax liabilities.” These allegations could invite a closer look from the Internal Revenue Service or the U.S. Securities and Exchange Commission.

Infosys, as part of the settlement, also agreed to improve its visa compliance procedures.

Ron Hira, a public policy professor at the Rochester Institute of Technology and a researcher on tech immigration issues, points out that the $34 million settlement represents a “mere 2% of Infosys profits of $1.7 billion last year.”

“Hopefully, policymakers and journalists don’t draw the conclusion that the ‘system works’ because Infosys has settled,” Hira said. “Instead, they should see this for what it is — one small indication of the vast extent to which firms are exploiting loopholes in the visa programs to bring in cheaper foreign workers to displace and undercut American workers.”

John Miano, founder of the Programmers Guild, said “it is great to see the government finally doing something.” He contends that the use of the B-1 visa to import labor is widespread.

“Until the government starts seeking criminal actions against individuals, illegal actions like this will continue,” Miano said.

Retrieved from ComputerWorld


What’s holding back the cloud industry?

While cloud enthusiasts roaming the halls of the McCormick Place convention hall in Chicago last week at Cloud Connect may be high on the market, the reality is that many enterprise IT shops are still reluctant to fully embrace public cloud computing.

Network World asked some of the best and brightest minds in the industry who were at the event what’s holding the cloud industry back. Here’s what they said:

The organization: Eric Hanselman, Chief Analyst, 451 Research Group

Cloud sounds like a great idea, but how will it really work when it’s adopted? Hanselman says one of the biggest barriers is an organizational one. Typically, IT organizations are split into groups focusing on compute, network and storage. When applications run from the cloud, those are all managed by one provider, which means the jobs of each of those groups within IT may change. How can organizations evolve? “You’ve got to converge,” Hanselman says. That could be easier said than done with people’s jobs at stake.


Cloud industry thought leaders (from left): Eric Hanselman, Krishnan Subramanian, Randy Bias, Bernard Golden, Andy Knosp

Security and application integration: Krishnan Subramanian, director of OpenShift Strategy at Red Hat; founder of Rishidot Research

Security is still the biggest concern that enterprises point to with the cloud. Is that justified? Cloud providers spend a lot of money and resources to keep their services secure, but Subramanian says it’s almost an instinctual reaction for IT pros to be concerned about cloud security. “Part of that is lack of education,” he says. Vendors could be more forthcoming with the architecture of their cloud platforms and the security around it. But doing so isn’t an easy decision for IaaS providers: Vendors don’t want to give away the trade secrets of how their cloud is run, yet they need to provide enough detail to assuage enterprise concerns.

Once IT shops get beyond the perceived security risks, integrating the cloud with legacy systems is their biggest technical challenge, Subramanian says. It’s still just not worth it for organizations to completely rewrite their applications to run them in the cloud. Companies have on-premises options for managing their IT resources and there just isn’t a compelling enough reason yet to migrate them to the cloud. Perhaps new applications and initiatives will be born in the cloud, but that presents challenges around the connections between the premises and the cloud, and related latency issues.

New apps for a new computing model: Randy Bias, CTO of OpenStack company Cloudscaling

If you’re using cloud computing to deliver legacy enterprise applications, you’re doing it wrong, Bias says. Cloud computing is fundamentally a paradigm shift, similar to the progression from mainframes to client-server computing. Organizations shouldn’t run their traditional client-server apps in this cloud world. “Cloud is about net new apps that deliver new business value,” he says. “That’s what Amazon has driven, and that’s the power of the cloud.” Organizations need to be forward-thinking enough, and willing enough to embrace these new applications fueled by big data and distributed systems, to produce analytics-based decision making and agile computing environments.

It’s more than just technology: Bernard Golden, VP of Enterprise Solutions for Enstratius, a Dell company

The biggest inhibitor to more prevalent cloud computing adoption is that organizations are still holding on to their legacy processes, says Golden, who recently authored the Amazon Web Services for Dummies book. It’s not just about being willing to use new big data apps and spin up virtual machines quickly. It’s the new skill sets for employees, technical challenges around integrating an outsourced environment with the current platform, and building a relationship with a new vendor. “For people to go beyond just a small tweak, there needs to be a significant transformation in many areas of the organization,” he says. “Each time there is a platform shift, established mechanisms are forced to evolve.”

Regulatory compliance: Andy Knosp, VP of Product for open source private cloud platform Eucalyptus

One of the biggest hurdles for broader adoption of public cloud computing resources continues to be the regulatory and compliance issues that customers need to overcome, Knosp says. Even if providers are accredited to handle sensitive financial, health or other types of information, there is “still enough doubt” among executives in many of these industries about using public cloud resources. Many organizations, therefore, have started by deploying low-risk, less mission-critical workloads to the public cloud. Knosp says the comfort level for using cloud resources for more mission-critical workloads will grow. It will just take time.

Retrieved from NetworkWorld

FCC lays down spectrum rules for national first-responder network

The U.S. moved one step closer to having a unified public safety network on Monday when the Federal Communications Commission approved the rules for using spectrum set aside for the system.

Also on Monday, the agency directed its Office of Engineering and Technology to start processing applications from vendors to have their equipment certified to operate in that spectrum.

The national network, which will operate in the prized 700MHz band, is intended to replace a patchwork of systems used by about 60,000 public safety agencies around the country. The First Responder Network Authority (FirstNet) would operate the system and deliver services on it to those agencies. The move is intended to enable better coordination among first responders and give them more bandwidth for transmitting video and other rich data types.

The rules approved by the FCC include power limits and other technical parameters for operating in the band. Locking them down should help prevent harmful interference with users in adjacent bands and drive the availability of equipment for FirstNet’s network, the agency said.

A national public safety network was recommended by a task force that reviewed the Sept. 11, 2001, terror attacks on the U.S. The Middle Class Tax Relief and Job Creation Act of 2012 called for auctions of other spectrum to cover the cost of the network, which was estimated last year at US$7 billion.

The public safety network is required to cover 95 percent of the U.S., including all 50 states, the District of Columbia and U.S. territories. It must reach 98 percent of the country’s population.

Retrieved from NetworkWorld

Six questions to ask your SDN supplier

Let’s face it – real-world deployments of SDN are currently few and far between. While deployments have started to reach beyond the leading-edge Googles and Microsofts of the world, most enterprises and service providers are still in research mode with respect to SDN and the related Open Networking and Network Functions Virtualization (NFV) movements. The good news is that suppliers – even the leading-edge suppliers – are also more SDN researchers and developers than installers and integrators right now.

With all the SDN research and development going on in the industry, there is no shortage of SDN publications and pronouncements. Unfortunately, much of this SDN material is focused on raw technological advancements or unrealized potential benefits. What is missing is the practical guidance for the operator looking to move to SDN sooner (and smoothly) rather than later (and roughly).

How does an operator fill the void between technology and reward? The following questions must be asked – and answered – if an SDN solution is to deliver on its promise:

Question 1 – How do you make it easy for me to integrate SDN solutions into my current network?

Let’s face it – change is the primary source of network problems. Most studies claim that software-related changes cause most network problems. And here we have SDN, not only representing major change to the network, but major software change to the network. You can bet that the fear, uncertainty, and doubt related to SDN rollouts slow adoption of SDN solutions and, ultimately, the delivery of SDN benefits. Two things go a long way toward reducing the FUD factor relating to SDN deployments.

First, validated working solutions provide solid proof that SDN solutions can be woven into current infrastructures without exposing a working network to all the risk typically associated with major network alterations. Second, the continued use of existing network designs and devices within a new SDN structure provides comfort from both a technical and a financial standpoint. Upgrading the old provides so much more comfort than replacing it with the new.

Question 2 – How do you promote the development and delivery of SDN applications?

The more advanced SDN solution suppliers have already established strong programs aimed at attracting application developers. However, attracting developers is one thing. Nurturing them is another. How is this different, you ask? Well, APIs and SDKs send notice that an SDN solution is open to third-party development. These tools serve as the initial attraction to developers. To heighten delivery of SDN applications, people and programs must stand behind the products. Here, engineering resources, certification programs, and integration services drive success over the longer term – for the SDN supplier, application developer, and network operator.

Question 3 – How do you ensure that partners provide maximum impact for my network? What role do partners play in your SDN solution and what type of partners are most important to you?

By its very nature, the SDN environment is multivendor. No one vendor will come close to providing all the key pieces of the SDN puzzle. From devices to controllers to applications, SDN is, first and foremost, an integration challenge for all – suppliers and operators. The more successful suppliers will minimize the integration burden for operators. This requires the build-out of a high-impact ecosystem of partners and, most importantly, the execution of programs that serve to validate third-party product interoperability, facilitate the deployment of mixed vendor solutions, and streamline the delivery of support services aimed at day-to-day operations and future enhancements.

Question 4 – How do you help me get started with SDN?

The enterprise or service provider is increasingly sold on SDN. The benefits are just too compelling to ignore. Now comes the hard part – implementation. How do the operators prepare? Where should they focus their deployment efforts first? What are the first key steps to success? Where will they see immediate impact? What are the risks? The more concrete the guidance, the better the result will be.

With that said, bear in mind that suppliers themselves are learning on the fly with SDN, too. Here, the supplier that is willing to commit in-line resources — not just off-line verbal or written guidance — to your preparation, installation, and initial operation is solidifying not only the operator’s specific deployment, but their own SDN technology and techniques.

Question 5 – How do you assure that your SDN solution promotes open networking?

SDN will never deliver on its full promise unless it is enabled by open networking technology. As earlier questions and answers indicate, systems integration, application development, ecosystem management, and rewarding deployments all pose a challenge to SDN adoption. Core to solving all of these challenges is effective use of open networking technology wherever possible. Does this mean that proprietary software and hardware systems are dead in SDN environments? Not at all. Standardized systems will never keep pace with all operator demands, and here the proprietary system will serve us all well. The key for operators is to leverage all the open networking movement has to offer while choosing proprietary solutions that drive maximum value and accommodate ready replacement or adaptation as standards evolve.

… and a final bonus question:

We’re early on in SDN adoption. Even the early users of SDN have applied the technology in rather specialized ways – e.g. data center traffic management. We’re learning new SDN technologies, techniques, and troubles every day, it seems. With the vast majority of operators very unsure about what to do with SDN, the above questions must be asked and answered before SDN moves into the mainstream. With all the potential benefits of SDN, widespread adoption is inevitable. Solid answers to the above questions make SDN happen sooner rather than later!

Of course, with all that said, every operator should also be asking themselves, “How could SDN help my users, my staff, my organization… now?”

Retrieved from NetworkWorld

Paul Ryan and Elizabeth Warren will now take your petitions

When a handful of Denver residents came together this year to demand a protected bike lane downtown, their most effective weapon wasn’t a visit to city hall or a coordinated telephone campaign. It was an online petition — 834 signatures that inspired two city council members and the Denver Public Works commission to publicly back their cause.

“Let me know what else I need to do to help this,” councilwoman Susan Shepherd wrote back, right on the petition’s Web site.

The experience encouraged Change.org, the platform behind the petition, to tie leaders more closely to those they represent. The organization seems serious about recruiting powerful people. On Wednesday, Change.org rolled out an upgrade called Decision Makers, which features profile pages for members of Congress that collect all of their public responses. Yep, that’s right: Your elected officials will now be responding directly to your petitions.

Rep. Paul Ryan (R-Wis.) was among the first to sign up.

“Change.org is going to be a big help,” said Ryan, chair of the House Budget Committee. “It will be a transparent, public forum where we can talk with our constituents.”

Others who’ve pledged to join the program include Sen. Elizabeth Warren (D-Mass.), Rep. Henry Waxman (D-Calif.) and Rep. Mike Honda (D-Calif.). The push to include lawmakers will soon lead to a broader focus on city officials, governors, business executives and others. One mayor, San Francisco’s Ed Lee, has indicated he’ll be responding to petitioners, too.

We the People, run by the people

Targets of a petition currently get notified when they receive a request, and when new signatures are added. But for the most part, it’s been a one-way relationship. Now the lines of communication will go both ways, with opportunities for a petition creator to keep rallying regardless of the response. In an interview, Change.org’s Jake Brewer said the system works much like the White House’s We the People petition site — but with some key differences. For one thing, while decision makers have access to the platform to log their responses, they don’t own it or set the rules. That’s a subtle dig at the Obama administration, which has on occasion decided to ignore petitions that crossed the signature threshold required for an official response.

That said, what also sets Change.org’s Decision Makers program apart from We the People is that it won’t include an explicit signature threshold. Brewer said Change.org simply makes recommendations for where to draw the line depending on a constituency’s population size and geography. In other words, it’s the civic process that matters — not the outcome.

“We need people’s voices to outweigh money in politics,” Brewer said.

Do online petitions really matter?

In its basic mechanics, the online petition of 2013 isn’t so far removed from its offline cousin. But in the way it’s socially regarded, the Internet petition has become a prominent symbol for 21st-century politics. Participating in a movement has never been easier, a fact that has led critics to dismiss the online petition as so much idle slacktivism.

Yes, it’s far easier to put your name on a petition than it is to stage a physical sit-in. At the same time, managing a petition — and doing it well — can be deceptively hard. At the national level, it takes 100,000 signatures to warrant a White House reply. (To be fair, the administration has had to raise the threshold over time because hitting the mark grew easier as the service became more popular. But think of it this way: Only a fraction of White House petitions have ever produced a substantial policy change.)

“A hundred signatures on a petition to a school board is going to have much greater impact than — I mean, you’re going to need to get 50,000 or 100,000 for a Paul Ryan or a Liz Warren before they even begin to take notice,” said Evan Sutton, a spokesperson for the New Organizing Institute, a left-leaning think tank for digital politics.

In short, there’s an inverse relationship between the importance of a decision maker and the work it’ll take to get them to listen, much less act.

The balance of power

If the bar for results is set so high, then who benefits more? Constituents, who through Change.org now enjoy greater access to elected officials? Or lawmakers, who by volunteering to respond can offer the promise of engagement without really committing themselves to anything?

In that respect, online petitions are really no different from constituent mail or phone calls, which can just as easily be written off. Where they might make a big difference, however, is in the way lawmakers talk to themselves and each other.

On the one hand, petitions can help officials make better choices. They’re not only an indication of what voters care about; they’re also a clue as to how strongly those voters feel. Dueling petitions give lawmakers insight into which position is more popular — or perhaps simply more organized. On the other hand, petitions also tend to activate the poles of the electorate. Lawmakers who are made to feel more accountable to them may wind up exacerbating partisanship. They might even marshal extreme petitions as evidence for their political agendas.

Still, the fact that petitions tend to be more effective in a local setting suggests that, on balance, when decision makers refer to a petition as a way to back themselves up, the effects will go toward producing results rather than talking points. Incidentally, that’s precisely what happened when Denver councilwoman Susan Shepherd signed onto the bike lane proposal. Within a week of Shepherd’s announcement, two other public officials had addressed the issue. The petition had become a collaborative policy initiative.

“I think that’s when it really starts to distinguish itself from a petition that goes at people,” said Brewer. “It actually starts to transition to, ‘Okay, the next step of this solution is that other people need to be involved. Let’s go involve them.’ ”

Retrieved from WashingtonPost

China’s Alibaba to expand U.S. reach with new investment group

China’s Alibaba Group is poised to invest more in U.S. tech companies with the start of a new investment group that the e-commerce giant is setting up in San Francisco.

Alibaba is looking to back “innovative platforms, products, and ideas” that focus on e-commerce and new technologies with the investment group, the company said in an email Wednesday.

The company recently invested in three U.S. tech companies, the latest being ShopRunner, an online retailer that competes against Amazon. Alibaba led a recent investment round for ShopRunner that raised US$200 million.

Earlier in the year, the company also funded Quixey, a search engine for mobile apps, and Fanatics, a retailer of licensed sports merchandise.

The U.S. market and Silicon Valley have talent and expertise the Chinese e-commerce company wants to tap into, said Mark Natkin, managing director for Beijing-based Marbridge Consulting. At the same time, Alibaba has ambitions to become more international. Its investments in the U.S. could lay the groundwork for an eventual expansion into the country’s market, Natkin added.

“It’s often more effective, more cost efficient, to acquire a company that already has demonstrated success in the area you are trying to expand in,” he said.

While not as well known in the U.S., Alibaba reigns as the largest e-commerce company in its home market. The company established Tmall and Taobao, two of the country’s most popular online retail sites.

In the U.S., the company has a smaller presence with its wholesale supplier site Alibaba.com, and AliExpress, which sells products to businesses and even consumers across the world.

Alibaba could also decide to list on a U.S. stock exchange, with an initial public offering some reports have estimated could value the company at over $100 billion.

Retrieved from NetworkWorld

Controversial cyberthreat bill CISPA may return to Congress


After suffering defeat this spring, the controversial legislation aimed at preventing cyberthreats, CISPA, may be returning to the Senate. According to Mother Jones, two senators are now working on a new version of the bill that looks to address some of the concerns that kept it from initially passing. The goal of the bill will still be to make it easier for private companies to share information with the government regarding cyberthreats; however, the type of information that can be shared will reportedly be narrower in scope this time around.

The bill won’t target Americans’ communications

As the legislation is still being written, it’s not clear exactly how different its updated form will be. Mother Jones reports that Senators Dianne Feinstein (D-CA) and Saxby Chambliss (R-GA) are working together to draft the bill. “The goal is to allow and encourage the sharing only of information related to identifying and protecting against cyberthreats, and not the communications and commerce of Americans,” Feinstein’s office tells Mother Jones in a statement. Feinstein in particular has been a major proponent for facilitating this type of sharing, having also been in support of expanding FISA.

In light of the NSA leaks, Mother Jones suggests that so many companies may have initially stood in support of CISPA — the Cyber Intelligence Sharing and Protection Act — because it could have granted them protections for handing over information as part of PRISM. But those leaks should only make a reintroduction of CISPA, however limited, all the more disconcerting for privacy advocates. NSA director General Keith Alexander even called for the bill earlier this month, saying that legislation must be put in place before the US is hit with a cyberattack. But it’s only become more evident since CISPA was defeated how widely the NSA is able to access American citizens’ information as it is, and a new bill would only expand those abilities.


Retrieved from TheVerge