Monthly Archives: June 2012

Wickr mobile privacy app sweeps digital crumbs away

Users can set a time period for how long a recipient can view a message before it becomes unrecoverable, according to Wickr’s creators

A new mobile application for Apple devices called Wickr lets people exchange files and messages without leaving digital traces that could be examined by law enforcement or cyber spies.

Wickr, released on Wednesday, addresses the raft of privacy concerns that arise when a person sends a sensitive message: email providers, ISPs, mobile phone companies and social networking sites all retain detailed records of activity on their networks.

Those records could be requested by law enforcement or accessed potentially by other people with ill intentions. San Francisco-based Wickr offers a system that is based on heavy encryption, no log files and a robust data destruction system to ensure data stays secret forever.

Senders of a message or photo can set a self-destruct time for the data ranging from a few seconds to six days in the free version of Wickr. As soon as the recipient who has Wickr installed opens the message, the countdown begins.

“No matter what you do, you cannot stop the clock,” said Robert Statica, an information technology professor at New Jersey Institute of Technology, who cofounded Wickr with Nico Sell, Christopher Howell and Kara Coppa.
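
For illustration only, here is a minimal Python sketch of a message whose countdown starts the moment it is first opened, as described above; the class and field names are assumptions, not Wickr’s actual design.

```python
import time
from dataclasses import dataclass
from typing import Optional


@dataclass
class EphemeralMessage:
    """A message whose readable lifetime begins when it is first opened.

    Toy model of the behaviour described in the article (timers from a
    few seconds up to six days, started on first open); illustrative only.
    """
    body: bytes
    ttl_seconds: int                      # e.g. 6 * 24 * 3600 for six days
    opened_at: Optional[float] = None

    def read(self) -> Optional[bytes]:
        now = time.time()
        if self.opened_at is None:
            self.opened_at = now          # the countdown starts here
        if now - self.opened_at > self.ttl_seconds:
            return None                   # expired; the caller should securely wipe it
        return self.body
```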

Wickr makes it hard for a person to take a screenshot of a photo or video: the recipient has to hold down a “button” on the screen, and if a fingertip moves more than a couple of pixels, the data disappears, Statica said. To take a screenshot on an iPhone, a person must push the power button and home button at the same time.
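
A toy Python model of that hold-to-view mechanism might look like the following; the two-pixel tolerance is inferred from the article’s wording rather than a documented value.

```python
class HoldToView:
    """Hide content if the fingertip holding the on-screen button drifts.

    Toy model of the screenshot deterrent described above: the viewer
    must keep a finger nearly still, and moving more than a couple of
    pixels hides the data. Thresholds and method names are illustrative.
    """
    TOLERANCE_PX = 2.0

    def __init__(self) -> None:
        self._anchor = None
        self.visible = False

    def touch_down(self, x: float, y: float) -> None:
        self._anchor = (x, y)
        self.visible = True               # show the photo while the finger is held

    def touch_move(self, x: float, y: float) -> None:
        if self._anchor is None or not self.visible:
            return
        ax, ay = self._anchor
        if abs(x - ax) > self.TOLERANCE_PX or abs(y - ay) > self.TOLERANCE_PX:
            self.visible = False          # fingertip drifted: hide the data

    def touch_up(self) -> None:
        self.visible = False              # releasing the button hides the data again
```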

Once the time period has expired, Wickr overwrites the photo or file in the device’s memory with random data. This matters because computers and other devices don’t immediately erase data that has merely been marked as deleted; with special computer forensics software, that data can often be recovered.

“The operating system reports that the file has been deleted but in fact the file remains on the hard drive on the device until it is overwritten,” Statica said.
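
As a rough illustration of that overwrite-then-delete idea (not Wickr’s actual code), a Python sketch could look like this; real secure deletion on journaling file systems and flash storage is considerably harder.

```python
import os


def secure_delete(path: str, passes: int = 1) -> None:
    """Overwrite a file with random bytes before unlinking it.

    A minimal sketch of the general technique described above; wear
    levelling and file-system journaling can still leave stale copies.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))     # replace the contents with random data
            f.flush()
            os.fsync(f.fileno())          # push the overwrite out to storage
    os.remove(path)                       # only now drop the directory entry
```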

Before transmission, text and photos are scrambled on the device using 256-bit AES (Advanced Encryption Standard) encryption. The encryption keys are also encrypted and only used once before being discarded. Wickr doesn’t have access to any of the encryption keys used for securing data.
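
A hedged Python sketch of per-message 256-bit AES encryption is shown below; the article confirms the key size and the single-use keys, but the GCM mode and key handling here are assumptions, not Wickr’s published design.

```python
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography


def encrypt_once(plaintext: bytes):
    """Encrypt one message with a fresh, single-use 256-bit AES key.

    Illustrative only: the mode (AES-GCM) and the handling of the key
    afterwards are assumptions made for this sketch.
    """
    key = AESGCM.generate_key(bit_length=256)   # one key per message, then discarded
    nonce = os.urandom(12)                      # 96-bit nonce recommended for AES-GCM
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    return key, nonce, ciphertext               # the key would itself be encrypted before any storage
```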

Even a person’s user name is stored by Wickr as a cryptographic cipher. “We don’t know who you are,” Statica said.
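
One common way to hold a user name without knowing it is a salted one-way hash, sketched below; this illustrates the general idea only and is not Wickr’s documented scheme.

```python
import hashlib


def obfuscate_username(username: str) -> str:
    """Store only a one-way digest of the user name, never the name itself.

    The salt value here is a placeholder assumption for the sketch.
    """
    salt = b"per-deployment-salt"         # hypothetical salt, not a real Wickr value
    return hashlib.sha256(salt + username.encode("utf-8")).hexdigest()
```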

As an added security measure, data is sent using SSL (Secure Sockets Layer), an encrypted security protocol. Only encrypted data passes through Wickr’s servers, and log files are deleted. Statica said no information is retained by Wickr about what files users are sending and to whom.
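
The double layer, device-side encryption plus SSL transport, can be pictured with a short Python sketch like this one; the host, port and framing are hypothetical.

```python
import socket
import ssl


def send_over_tls(host: str, port: int, encrypted_blob: bytes) -> None:
    """Relay an already-encrypted payload over an SSL/TLS connection.

    The payload is encrypted on the device first, so the server only
    ever handles ciphertext; connection details here are hypothetical.
    """
    context = ssl.create_default_context()
    with socket.create_connection((host, port)) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
            tls_sock.sendall(encrypted_blob)    # transport adds its own layer of encryption
```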

The only real way to see something sent to a Wickr user would be to steal the person’s phone. Even then, five wrong attempts at the password will cause Wickr to erase itself.
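
A toy Python sketch of that wipe-after-five-failures rule follows; the `wipe_local_data` callback is a hypothetical hook, not a real Wickr API.

```python
class LoginGuard:
    """Wipe local data after too many bad password attempts.

    Toy illustration of the 'five wrong attempts and Wickr erases
    itself' behaviour described above; both callbacks are hypothetical.
    """
    MAX_ATTEMPTS = 5

    def __init__(self, verify_password, wipe_local_data):
        self._verify = verify_password
        self._wipe = wipe_local_data
        self._failures = 0

    def attempt(self, password: str) -> bool:
        if self._verify(password):
            self._failures = 0            # a correct password resets the counter
            return True
        self._failures += 1
        if self._failures >= self.MAX_ATTEMPTS:
            self._wipe()                  # erase keys and cached messages
        return False
```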

Wickr also tackles the privacy problems concerning metadata, or information about a file or photograph that is often included as part of the default settings of an application. Metadata can reveal more information than perhaps the person who took the photo or sent the file really wants to share.

Cameras, for example, will often include data such as GPS information, times and dates. Word processing programs can note who has viewed or edited a document and the file server where it has been stored. Wickr scrubs files of metadata.
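
One simple way to scrub such metadata is to re-encode an image with only its pixel data, as in the Python sketch below using the Pillow library; the article does not describe how Wickr does this internally.

```python
from PIL import Image  # pip install Pillow


def strip_metadata(src: str, dst: str) -> None:
    """Re-encode an image without its EXIF block (GPS, timestamps, etc.).

    Copies only the pixels into a fresh image, so the original metadata
    is not carried over; works for common modes such as RGB JPEGs.
    """
    with Image.open(src) as img:
        clean = Image.new(img.mode, img.size)
        clean.putdata(list(img.getdata()))   # copy pixel data only
        clean.save(dst)                      # saved without the original EXIF
```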

So far, Wickr has been poked and prodded by well-known computer security pros, said cofounder Nico Sell. “I’ve had a number of my hacker friends break it,” she said. “They fixed a lot of things.”

But Wickr is ready to go: a free version is available in Apple’s App Store, and an Android application is under development. A paid-for premium version of Wickr is also in the works that will let users buy specific features, such as extending the time period before data is deleted. “We are always planning to do more,” Sell said.

Retrieved from InfoWorld

No need to comply with data laws if it’s too difficult – EU ministers

If you don’t know data is personal, maybe it isn’t?

Organisations will not have to abide by data protection laws if checking whether the information they hold is personally identifiable would be too difficult, too time-consuming or would use up too many important resources, the EU’s Council of Ministers has proposed.

The Council has outlined some revisions (112-page/575KB PDF) to the European Commission’s draft General Data Protection Regulation that was originally published in January. Under its proposals, published by information law experts Amberhawk Training, the Council has suggested amending the definition of ‘personal data’ as well as a recital outlining the scope of when the laws in the Regulation would apply to information.

Both the revised definition and recital contain new proposals that would see information declared not personally identifiable if it was too burdensome to assess whether it actually is.

The proposed definition states:

‘Personal data’ means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable person is one who can be identified, directly or indirectly, by means reasonably likely to be used by the controller or by any other natural or legal person, in particular by reference to a name, an identification number, location data, online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that person. If identification requires a disproportionate amount of time, effort or material resources the natural living person shall not be considered identifiable.

The scope of data protection should only apply to “information concerning an identified or identifiable natural person” and “account should be taken of all the means reasonably likely to be used either by the controller or by any other person to identify the individual, unless this would involve a disproportionate effort in terms of time or technical or financial resources,” according to the Council’s proposed recital.

Market researchers, online ad firms won’t benefit

Information law expert Marc Dautlich of Pinsent Masons, the law firm behind Out-Law.com, said that the proposed amendments do not adequately assist industries whose focus is research, or those whose business models are not really focused on the individual.

“Market research companies and academic organisations which really aren’t focused on the individual but on research and, perhaps more controversially, the online advertising industry struggle with this binary distinction between personal and non-personal data,” said Dautlich. “So-called indirectly identifiable data manifests itself online in IP addresses, in cookies and in other forms and it is difficult to follow how in practice this disproportionate effort test would help alleviate the problem.”

Anonymised information that does not allow individuals to be identified and information about dead people should be outside the scope of the data protection law, the Council proposed.
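
To make the distinction concrete, a common anonymisation step for one kind of indirectly identifiable data, the IP address, is to truncate it, as in the Python sketch below; the Regulation itself does not prescribe any particular method.

```python
import ipaddress


def coarsen_ip(address: str) -> str:
    """Truncate an IP address so it no longer points to a single user.

    Keeps only a network prefix (a /24 for IPv4, a /48 for IPv6), one
    widely used anonymisation technique shown here for illustration.
    """
    ip = ipaddress.ip_address(address)
    prefix = 24 if ip.version == 4 else 48
    network = ipaddress.ip_network(f"{address}/{prefix}", strict=False)
    return str(network.network_address)


print(coarsen_ip("203.0.113.42"))   # -> 203.0.113.0
```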

The UK had unsuccessfully suggested limiting the application of data protection laws under the Regulation to circumstances where a person is “easily identifiable”, the Council’s draft revisions document said.

The European Commission’s draft General Data Protection Regulation was one of two legislative texts the Commission proposed and, if enacted, would introduce a single data protection law across all 27 EU member states. Companies that process the personal data of EU citizens from outside the borders of the trading bloc would also be subject to the rules.

However, the presidency of the Council of Ministers has detailed proposed revisions to the recitals and first ten articles of the draft Regulation, following comments by individual EU member states. The revisions document sets out that some countries in the trading bloc oppose the creation of a Regulation at all, while the terms of the Regulation itself were also heavily contested, resulting in proposals to alter the rules around consent to personal data processing.

Under existing EU data protection laws, and the original draft reforms, obtaining consent from individuals is one way for organisations to obtain a lawful ground to process personal information.

Under the planned reforms, organisations operating in the EU would generally have to obtain explicit, freely given, specific and informed consent from individuals in order to lawfully process their personal data. That consent could not be gleaned through silence or inactivity on the part of individuals; instead, it would have to be given through a statement or “clear affirmative action” before it can be said to have been given.

However, the Council reported that some EU member states, including the UK, had raised concerns that requiring consent to be explicit, freely given, specific and informed was “unrealistic” and had “queried its added value.”

Consent can be ‘implied’

Germany had wanted “conditions for electronic consent” to be set out in the text, while the Czech Republic argued that consent should only be deemed to have been given if it was “provable” rather than “explicitly” given. However, the European Commission said the Czech Republic’s suggestion did not account for the possibility that consent can be implied, and that there were already provisions in the Regulation to ensure that “consent should not be unnecessarily disruptive to the use of the service for which it is provided.”

Under the Council of Ministers’ plans the “burden of proof” will be on the data controlling organisation to show that they have achieved legitimate consent to processing. The UK said this proposed rule would “put a heavy regulatory burden on companies.” The plans also account for the rights of individuals to withdraw their consent at any time, but revisions have been proposed which state that the withdrawal does not affect the lawfulness of processing based on consent prior to the withdrawal and that “nor shall it affect the lawfulness of processing of data based on other grounds.”

The Council also outlined plans to expand the number of lawful grounds that can be relied upon to justify personal data processing as an alternative to obtaining consent. Among the proposals are plans to expand the right of data controllers to process personal data where their “legitimate interests” are not outweighed by the fundamental rights of the individuals concerned. Under the Council’s draft revisions, the “legitimate interests” of third parties could also be considered as grounds to justify personal data processing where doing so would not interfere with the fundamental rights of individuals.

Further proposed expansions to the lawful grounds for personal data processing cover select circumstances relating to sensitive information, and cases where the processing is necessary to comply with freedom of expression rights, in an employment context, or for historical, statistical and scientific purposes, among others.

Revisions have also been suggested to enable personal data collected for “specified, explicit and legitimate purposes” to be further processed “for historical, statistical or scientific purposes”, subject to certain “conditions and safeguards” contained in the draft Regulation. Those conditions require, among other things, that the processing only go ahead if the purposes cannot be achieved by processing anonymised data instead, and generally that data attributable to individuals is kept separate from “other information”.

The Council of Ministers’ document also detailed views on the application of data protection laws to information posted on social networks. The Commission’s draft Regulation contains a particular provision that exempts the rules laid out in the text from applying to “the processing of personal data … by a natural person without any gainful interest in the course of its own exclusively personal or household activity”.

The Commission said that this ‘household exemption’ should apply to social network users unless they set their privacy settings to ‘public’, “ie, when personal data are available … to an unrestricted number of individuals and not only to a limited audience at large.” The UK had also argued that “selling personal possessions on an auction site” should fall under the exemption.

Retrieved from The Register

GOP Senators revise cybersecurity bill

New version of SECURE IT takes a less regulatory approach than the Democratic-backed Cybersecurity Act, sponsors say

A group of Republican senators on Wednesday introduced a revised version of a previously proposed bill that seeks to enhance cybersecurity by improving the sharing of information between private industry and government.

The new Strengthening and Enhancing Cybersecurity by Using Research, Education, Information and Technology Act (SECURE IT) is being put forth as a less regulatory alternative to another Senate bill, the Cybersecurity Act, which was introduced earlier this year by Senate Democrats.

The main difference between the two bills is that, unlike the Democratic version, the Republican version does not give any new regulatory authority to the federal government to set cybersecurity standards. The new version of SECURE IT also restricts the purposes for which government can retain and use information about cyberthreats.

SECURE IT, backed by Sens. John McCain (R-Ariz.), Kay Bailey Hutchison (R-Texas), Chuck Grassley (R-Iowa), Saxby Chambliss (R-Ga.), Lisa Murkowski (R-Alaska), Dan Coats (R-Ind.), Ron Johnson (R-Wis.), and Richard Burr (R-N.C.), will allow companies to legally share real-time cyberthreat information from their networks with other industry stakeholders, law enforcement agents and government officials.

Security experts believe that such information-sharing is vital to combating cyberattacks. The bill will also encourage investment in tools and training for preventing and remediating cyberattacks.

In addition, SECURE IT seeks to strengthen criminal statutes against cybercrime and will require federal contractors to notify their government customers of any security incidents affecting their services.

Many of the objectives are similar to those proposed in the Cybersecurity Act. What’s different is that SECURE IT does not give the government any new regulatory authority.

The Democratic bill gives the U.S. Department of Homeland Security the right to evaluate the security practices of enterprises that operate components of the nation’s critical infrastructure. It would require operators that are found deficient in their security practices to work with the DHS to remedy the situation.

With SECURE IT, the focus is more on deterrence than on regulation, according to a statement issued on Wednesday by the senators who sponsored the bill.

“I have no faith that federal regulators should take the lead on cybersecurity,” Sen. Johnson said in the statement. “The regulatory process simply cannot keep up with the rapid pace of technology. Rather than try to impose a comprehensive approach, we need to take this one step at a time — building confidence between government and the private sector, and ensuring protections for civil liberties.”

The revised version of SECURE IT tightens up the definition of cyberthreat information. It also spells out the responsibilities of government organizations and industry stakeholders when sharing information about cyberthreats.

It includes language aimed at ensuring that federal agencies adopt and update security tools for combating cyberthreats. “The surest and quickest way to improve cybersecurity in this country is to leverage the capabilities and flexibility of the private sector instead of creating costly layers of government bureaucracy,” Sen. Coats said in the statement.

House lawmakers passed their version of a similar information-sharing bill (H.R. 3523) in April. That bill, called the Cyber Intelligence Sharing and Protection Act (CISPA), attracted considerable criticism from privacy advocates and others, who fear it will eviscerate privacy rights.

President Obama has threatened to veto any cybersecurity bill containing provisions like those in CISPA.

Retrieved from Computerworld