Friday, April 11, 2014

BREAKING: Third Circuit vacates conviction in United States v. Auernheimer due to improper venue

The United States Court of Appeals for the Third Circuit has just announced that the conviction of Andrew Auernheimer (known by many as Weev) has been reversed on venue grounds.

The opinion states (emphasis added):
This case calls upon us to determine whether venue for Andrew Auernheimer’s prosecution for conspiracy to violate the Computer Fraud and Abuse Act (“CFAA”), 18 U.S.C. § 1030, and identity fraud under 18 U.S.C. § 1028(a)(7) was proper in the District of New Jersey. Venue in criminal cases is more than a technicality; it involves “matters that touch closely the fair administration of criminal justice and public confidence in it.” United States v. Johnson, 323 U.S. 273, 276 (1944). This is especially true of computer crimes in the era of mass interconnectivity. Because we conclude that venue did not lie in New Jersey, we will reverse the District Court’s venue determination and vacate Auernheimer’s conviction.
More to come. . . 

Thursday, April 10, 2014

Featured Article: The Internet and the Constitution: A Selective Retrospective

The Honorable M. Margaret McKeown of the United States Court of Appeals for the Ninth Circuit has a rather interesting article appearing in volume 9 of the Washington Journal of Law, Technology & Arts.

In her article, The Internet and the Constitution: A Selective Retrospective, Judge McKeown examines the complexities of the Internet and its associated innovations from a legal perspective, from the many jurisdictional and due process challenges to the implications for the First Amendment and free speech. Judge McKeown's story of "institutional stability in the face of change," however, is one she believes has been lost in the all-too-common narrative: "the Internet is changing all the rules and the system can’t keep up."

I found the entire article fascinating, but for those looking for a cybercrime hook, the article's discussion on “The Fourth Amendment and Privacy,” beginning on page 161, may be of particular interest.

The abstract appears below:
Over the last two decades, the Internet and its associated innovations have rapidly altered the way people around the world communicate, distribute and access information, and live their daily lives. Courts have grappled with the legal implications of these changes, often struggling with the contours and characterization of the technology as well as the application of constitutional provisions and principles. Judge M. Margaret McKeown of the United States Court of Appeals for the Ninth Circuit has had a close-up view of many of these Internet-era innovations and the ways the courts have addressed them. In this Article, adapted from her October 2013 Roger L. Shidler Lecture at the University of Washington School of Law, Judge McKeown offers her retrospective thoughts on the ways courts have handled constitutional issues in Internet cases. She also discusses some of the challenges currently facing courts and legislators alike as the U.S. legal system incorporates and accommodates Internet-based technologies and the societal, commercial, governmental, and relational changes they spawn.

Tuesday, April 8, 2014

WI governor signs revenge porn and social media privacy bills into law; privacy bill raises questions

(Update 1: Included link and excerpt from Rep. Sargent's Op-Ed when the bill was introduced, and further comments - to provide some context)

Governor Scott Walker of Wisconsin signed 62 bills into law today, including SB223 (relating to social media privacy) and SB367 (revenge porn).

A full list of the bills he signed can be found here: At a glance: List of 62 bills Gov. Walker signed. Regarding the two bills mentioned above:
Senate Bill 223 – prohibits employers, educational institutions and landlords from requesting or requiring passwords or other protected access to personal internet accounts of students, employees, and tenants. Viewing, accessing and using information from internet accounts, including social media, in the public domain is allowed. Senator Glenn Grothman (R-West Bend) and Representative Garey Bies (R-Sister Bay) authored the bill which unanimously passed the Senate and passed the Assembly on a voice vote; it is Act 208.
Senate Bill 367 – modernizes Wisconsin’s law relating to disseminating private images and expands protections for victims who have their private images distributed without their consent. Senator Leah Vukmir (R-Wauwatosa) and Representative John Spiros (R-Marshfield) authored the bill which passed both the Senate and the Assembly on a voice vote; it is Act 243. 
I criticized the original revenge porn bill proposal in Wisconsin (see: Wisconsin's "revenge porn" bill goes too far. Hypos to ponder and why the legislature should look to Professor Franks); specifically, I labeled the original proposal as overbroad because the bill did not include a scienter requirement. In the final bill, after a substitute amendment was adopted, the statutory text has been narrowed with just such a requirement. The bill signed into law requires "knowledge":
942.09 (3m) (a) Whoever does any of the following is guilty of a Class A misdemeanor: 
1. Posts, publishes, or causes to be posted or published, a private representation if the actor knows that the person depicted does not consent to the posting or publication of the private representation. 
2. Posts, publishes, or causes to be posted or published, a depiction of a person that he or she knows is a private representation, without the consent of the person depicted.
The social media privacy bill signed by the governor will surely be lauded by privacy advocates as a win for individual autonomy (and freedom from employer/educational institution snooping). But, I find the exceptions to the bill much more intriguing and noteworthy than the protections most will focus on. Particularly, the interesting carve-outs in bold:
(2) Restrictions on employer access to personal Internet accounts.  
   (a) Except as provided in pars. (b), (c), and (d), no employer may do any of the following:
1. Request or require an employee or applicant for employment, as a condition of employment, to disclose access information for the personal Internet account of the employee or applicant or to otherwise grant access to or allow observation of that account.
2. Discharge or otherwise discriminate against an employee for exercising the right under subd. 1. to refuse to disclose access information for, grant access to, or allow observation of the employee's personal Internet account, opposing a practice prohibited under subd. 1., filing a complaint or attempting to enforce any right under subd. 1., or testifying or assisting in any action or proceeding to enforce any right under subd. 1. 
3. Refuse to hire an applicant for employment because the applicant refused to disclose access information for, grant access to, or allow observation of the applicant's personal Internet account. 
   (b) Paragraph (a) does not prohibit an employer from doing any of the following:

2. Discharging or disciplining an employee for transferring the employer's proprietary or confidential information or financial data to the employee's personal Internet account without the employer's authorization.
3. Subject to this subdivision, conducting an investigation or requiring an employee to cooperate in an investigation of any alleged unauthorized transfer of the employer's proprietary or confidential information or financial data to the employee's personal Internet account, if the employer has reasonable cause to believe that such a transfer has occurred, or of any other alleged employment-related misconduct, violation of the law, or violation of the employer's work rules as specified in an employee handbook, if the employer has reasonable cause to believe that activity on the employee's personal Internet account relating to that misconduct or violation has occurred. In conducting an investigation or requiring an employee to cooperate in an investigation under this subdivision, an employer may require an employee to grant access to or allow observation of the employee's personal Internet account, but may not require the employee to disclose access information for that account.
So, an employer may not require you to provide access to your personal Internet account on a whim or a hunch. But, if the employer can point to an Acceptable Use Policy or text in an employee handbook, or can establish reasonable cause to believe employment-related misconduct occurred, the employer can require such access. Sure, you don't have to provide your login/password, but under subsection 3, above, you could be required to grant access (whatever that means).

The social media bill's carve-outs sound a lot like recent CFAA cases, as well as general social media prying lawsuits. How, then, is this bill a boon for employee/student privacy? Also, if my employer requested I grant access to a personal account, as part of an "investigation," I would almost assuredly deny that request, absent a subpoena. I am very curious how these exceptions will be used by employers going forward.

Update 1: 

Rep. Sargent wrote an Op-Ed in the Milwaukee Journal Sentinel when she proposed the bill (with other representatives). See here: Bipartisan bill protects social media accounts

Later, after the bill made it out of the Senate on a 33-0 vote, Sargent issued a press release. See here: Social Media Protection Bill Passes Senate on a 33-0 Vote. An interesting quote from the release:
I’m pleased that this common sense, bi-partisan legislation advanced further through the legislative process today.  It makes sense that personal internet accounts should be given the same, 4th Amendment protections as other aspects of our daily lives.  People have a reasonable expectation of privacy when interacting with their friends and family on Facebook or other sites. An employer, university, or landlord should not have access to private communications on social media sites. As technology evolves, so must our legislative efforts to protect our citizen’s privacy. The current generation will write the laws on social media.  We must do it carefully and with respect for all parties involved.
There should, in my opinion, be an asterisk (*) after that paragraph, noting that the exceptions may indeed swallow a large chunk of the well-intentioned proposal. If the bill's intent was to prevent forced disclosure of account credentials, then the text should have narrowly reflected that (considering that the exceptions do not require providing credentials, but merely granting access). Further, just as some courts have attempted to bring TOS/Acceptable Use Policies/Employee Handbooks within the ambit of CFAA liability, this bill allows varying employer-defined standards to dictate whether an employee must grant access to a social media/personal email account.

Hypo: If an employee handbook states no surfing the internet for personal reasons (or updating social media) during work hours and there is "reasonable cause" to believe that a violation occurred - must the employee grant access to the account to prove otherwise? How is that giving personal internet accounts "4th Amendment protections...[similar to those in] other aspects of our daily lives?" What if the employee refuses to grant access - is that grounds for termination?

More fundamentally, though, is this question: now that the bill has become law, who benefitted more from its enactment: employers, or employees?

Monday, April 7, 2014

Court Rules in Favor of FTC, Wyndham Must Face Suit Over Data Breach

Today, a ruling was issued in FTC v. Wyndham Worldwide Corp. The court denied Wyndham's motion to dismiss, rejecting its argument that the Federal Trade Commission does not have authority under Section 5 of the FTC Act to regulate data security practices across all industries.

The U.S. District Court for the District of New Jersey declined to carve out a data-security exception to the FTC's broad regulatory authority under Section 5. It also refused to require the FTC to promulgate data security regulations before bringing "unfairness" claims against companies based on their data security practices, noting that previous enforcement actions "'constitute a body of experience and informed judgment to which courts and litigants may properly resort for guidance.'"

U.S. District Judge Esther Salas made clear that "this decision does not give the FTC a blank check to sustain a lawsuit against every business that has been hacked." However, the ruling disposes of the only viable challenge to the FTC's authority to regulate data security practices.

FTC Chairwoman Edith Ramirez issued a statement on the ruling via Twitter.

I wrote about the Wyndham litigation in a previous post. I look forward to further analyzing Judge Salas' ruling in a future post.

Wednesday, April 2, 2014

Undeterred by Challenges to its Authority, FTC Settles Data Security Actions with Credit Karma and Fandango

The Federal Trade Commission (FTC) has settled two more enforcement actions with companies that failed to adequately safeguard consumers’ personal information, despite challenges to its authority to regulate data security practices.

Credit Karma and Fandango Settle FTC Charges

Last week, the FTC announced that credit monitoring service Credit Karma and movie ticket outlet Fandango entered into settlement agreements that will require the companies to submit to 20 years of independent security audits, improve security measures, and refrain from misrepresenting their security and privacy processes. The FTC had charged both companies with violating Section 5 of the FTC Act (Section 5), which prohibits “unfair or deceptive acts or practices in or affecting commerce.” The agency alleged that Fandango and Credit Karma had engaged in unfair business practices by failing to properly implement Secure Sockets Layer (SSL) encryption on their mobile apps, thus leaving users’ payment information and other sensitive data vulnerable to “man-in-the-middle” attacks. The FTC also alleged that Fandango and Credit Karma had misrepresented the security of their apps, thereby deceiving customers.
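The technical misstep behind these charges is worth a moment: an app that skips certificate validation will happily complete a TLS handshake with anyone, including an attacker sitting between the user and the server. As a rough sketch (not the companies' actual code), here is what the secure default and the validation-disabled configuration look like in Python's standard `ssl` module:

```python
import ssl

# The secure default: the client validates the server's certificate chain
# and checks that the certificate matches the hostname it connected to.
secure = ssl.create_default_context()
assert secure.verify_mode == ssl.CERT_REQUIRED
assert secure.check_hostname is True

# Disabling validation -- the kind of misstep the FTC complaints describe --
# looks like this. Any certificate is accepted, so an interceptor can present
# its own and read or alter the "encrypted" traffic.
insecure = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
insecure.check_hostname = False          # must be disabled first
insecure.verify_mode = ssl.CERT_NONE     # accepts any certificate (unsafe)
```

With `verify_mode` set to `CERT_NONE`, the client never verifies who it is talking to, which is precisely what makes a "man-in-the-middle" interception invisible to the user.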

Since 2002, the FTC has brought and settled more than 50 similar data security enforcement actions against companies including Twitter, Rite Aid, and Petco. The FTC claims that it has broad authority under Section 5 to investigate and censure the data security missteps of companies across all industries, even though there is currently no overarching federal law mandating minimum data security standards.

Until recently, the FTC’s authority to regulate data security practices under Section 5 had gone largely uncontested. But a highly anticipated decision from a New Jersey federal court may soon provide guidance as to the extent of this authority.

FTC v. Wyndham Poses the First Serious Challenge to FTC Authority Over Data Security

In June 2012, the FTC filed a complaint against global hospitality company Wyndham Worldwide Corporation in federal district court, alleging that Wyndham “failed to provide reasonable and appropriate security” measures on its computer networks, which led to a series of large-scale breaches of personal information and more than $10.6 million in fraudulent charges to customers’ accounts.

Specifically, the FTC charged that Wyndham engaged in deceptive business practices in violation of Section 5 by misrepresenting in its privacy policies and elsewhere the security measures it employed to prevent the unauthorized access of customer data. The agency further alleged that Wyndham’s failure to maintain reasonable data security constituted an unfair business practice, also in violation of Section 5.

Wyndham responded by filing a motion to dismiss both the deception and the unfairness claims in the FTC’s complaint. Wyndham asserted, inter alia, that the FTC “has neither the expertise nor the statutory authority to establish data security standards for the private sector” under the “unfairness” prong of Section 5. Wyndham pointed out that the FTC has publicly acknowledged that it “lacks authority to require firms to adopt information practice policies,” and that it has repeatedly asked Congress to grant it broad, cross-industry authority to do so. Instead, Congress has enacted industry-specific legislation – such as the Health Insurance Portability and Accountability Act (HIPAA), the Gramm-Leach-Bliley Act (GLBA), and the Fair Credit Reporting Act (FCRA) – none of which authorized the FTC to bring an action against Wyndham.

In its reply, the FTC argued that Congress deliberately delegated broad authority to the FTC under Section 5 to “permit the FTC to protect consumers from unanticipated, unenumerated threats.” The FTC cited a range of uses of its Section 5 authority that were upheld by the courts, including the regulation of online check drafting and delivery, telephone billing practices, sales of telephone records, and sales of unsafe farm equipment.

In November 2013, Judge Esther Salas of the U.S. District Court for the District of New Jersey heard lengthy oral arguments on Wyndham’s motion to dismiss. Counsel for Wyndham argued that a lack of clear statutory authority for the FTC to regulate data security, coupled with the August 2013 release of a draft cybersecurity framework by the National Institute of Standards and Technology, demonstrated that Congress did not intend for the FTC to take the lead on data security enforcement.

At the conclusion of oral arguments, Judge Salas seemed poised to rule in favor of the FTC, denying a motion by Wyndham to stay discovery until she ruled on its motion to dismiss. In January, however, Judge Salas agreed to delay her ruling and allow supplemental briefing after an FTC Commissioner commented on the vagueness in the “unfairness” prong of the FTC’s Section 5 authority during congressional testimony.

A ruling is expected in the coming weeks. If Judge Salas rules in favor of Wyndham, she could seriously undermine the FTC’s authority over data security practices going forward. If she denies Wyndham’s motion to dismiss, the decision could pave the way for increased data security enforcement by the FTC.

After an Unsuccessful Challenge to FTC’s Authority, LabMD to Shut Down

Following Wyndham’s lead, another company challenged the FTC’s authority to regulate data security in an enforcement action brought by the FTC in August 2013. The FTC charged LabMD, a clinical health testing company, with violating Section 5 after the sensitive personal information of 9,300 people was exposed via a public file-sharing network, leading some to have their identities stolen.

In November 2013, LabMD filed a motion to dismiss, arguing that the FTC does not have authority to regulate data security practices with respect to patient health data under the “unfairness” prong of Section 5. LabMD claimed that because it provided cancer diagnoses to the patients of its physician-customers, its information practices are regulated under HIPAA, which it had not been accused of violating. In its response, the FTC argued that it shares concurrent authority with the Department of Health and Human Services over health information security. Once again, the FTC maintained that Section 5 gives it broad authority over “unfair” data security practices.

In January, the FTC issued an order denying LabMD’s motion to dismiss. It concluded that Congress delegated broad authority to the FTC to regulate “unfair acts or practices,” including those of HIPAA-covered entities. The FTC reiterated its argument in Wyndham that federal courts had upheld its Section 5 authority in a wide variety of contexts. 

Just days after the FTC’s order, LabMD announced that it would shut down, citing the “debilitating effects” of the FTC’s four-year investigation of the company and calling it an “abuse of power.”

LabMD has twice requested federal court review of the FTC’s actions, but the cases were subsequently dismissed and withdrawn. It is not clear whether the company will seek further review.

Thus, the Wyndham litigation presents the only viable challenge to the FTC’s data security enforcement efforts at this time.

Data Security is a Top FTC Priority

Though questions about the FTC’s authority to regulate data security practices remain, the FTC has made data security a “top priority” and shows no signs of slowing its enforcement efforts in this area. Accordingly, federal regulatory action is a very real threat to companies across all industries that fail to implement reasonable data security measures.

Cybercrime Review welcomes Natalie Nicol as a guest writer

I am excited to welcome Natalie Nicol as a guest writer for Cybercrime Review. She hopes to contribute to the blog regularly.

Natalie received her J.D. from University of California, Hastings College of the Law in 2013. During law school, Natalie worked at the Digital Media Law Project, a project of the Berkman Center for Internet & Society at Harvard University; the Electronic Frontier Foundation; and the First Amendment Project. She served as the symposium editor for the Hastings Communication and Entertainment Law Journal, and presented a day-long symposium on the Computer Fraud and Abuse Act last March. She is a graduate of the Walter Cronkite School of Journalism and Mass Communication at Arizona State University.

Natalie’s interests include Internet law, privacy, free expression, and intellectual property issues. In her free time, she enjoys live music and spending time with her dogs, Cleopatra and Penny. In her current role, she develops online content for lawyers and law firms across the country.
You can follow Natalie on Twitter at @natnicol.

Tuesday, March 18, 2014

Featured Paper: Is the Court Allergic to Katz? Problems Posed by New Methods of Electronic Surveillance to the "Reasonable-Expectation-of-Privacy" Test

Colin Shaff, a 3L at USC, has a new student note worth checking out: Is the Court Allergic to Katz? Problems Posed by New Methods of Electronic Surveillance to the "Reasonable-Expectation-of-Privacy" Test. It appears in the Spring 2014 edition of the Southern California Interdisciplinary Law Journal.

The note discusses Jones, Katz, Olmstead, FISA, the ECPA, and many other facets of Fourth Amendment jurisprudence.

An excerpt from the intro:
This Note will examine the way in which the Court and Congress have reacted to the challenges posed by emerging technology with regards to the Fourth Amendment’s “unreasonable search and seizure” clause. This Note argues that the best balance between protecting personal liberties and respecting the needs of law enforcement occurs when the Court, Congress, and state legislatures collaborate to craft robust statutory schema; in contrast, when the Court makes decisions without legislative input or when Congress acts without judicial guidance, the resulting law is often inadequate or incomplete. 

Tuesday, February 25, 2014

Featured Paper: Bridging the Cellular Divide: A Search for Consensus Regarding Law Enforcement Access to Historical Cell Data

From the February 2014 Issue of the Cardozo Law Review:

Zachary Ross, Bridging the Cellular Divide: A Search for Consensus Regarding Law Enforcement Access to Historical Cell Data.

Technological change is often a double-edged sword--it enables and enriches our lives, but also allows for new means of exploitation and control. As social, architectural, and market barriers protecting longstanding notions of personal space erode, individuals increasingly rely on the legal system as a defense to arbitrary invasions of privacy. Paradoxically, the same forces that make the need for robust privacy protections more compelling also make the existing legal framework outdated and inapposite. 
These contradictions are readily apparent in the contemporary debate over the legal restrictions on government access to cell site location information (CSLI). This data, constantly collected by cell phone service providers (CSPs) in order to manage their networks, has the potential to provide a detailed map of an individual cell user's movements from place to place over extended periods of time. Furthermore, the quantity and precision of location data collected by CSPs is constantly increasing, becoming more revealing, and more valuable to law enforcement in the process. Despite the potential intimacy of this data and its growing relevance to criminal investigations, the legal protection afforded CSLI is hotly disputed, and at present varies greatly among (and sometimes even within) jurisdictions-- with courts sometimes requiring a warrant, and sometimes allowing unfettered access upon a lesser evidentiary showing. This lack of uniformity has been exacerbated by a recent Fifth Circuit ruling on government access to CSLI, which generated a different rule than had previously been adopted by the Third Circuit. The vastly disparate treatment of government requests for CSLI has created a chaotic system ripe for abuse, and all but guaranteed Supreme Court review of the issue in the near future, as the Court itself seems to have implicitly acknowledged. 
This Note will examine the complex interaction between privacy, surveillance, and technology through an exploration of the contested legal terrain governing law enforcement access to historical CSLI--location data recorded by CSPs which reveal an individual's past movements. 

Monday, February 10, 2014

Personal Data Protection and Breach Accountability Act of 2014 would enact criminal penalties for "intentionally or willfully" concealing a security breach

Thanks in part to the recent security breaches at Target and Neiman Marcus, pressure for a federal response to data security has been mounting. Numerous bills have been introduced in the House and the Senate that call for new legislative enactments to answer the data security problem.

A common feature of many of these bills is a new criminal statute for individuals who knowingly and willfully fail to report a known security breach. I recently introduced readers to Senator Patrick Leahy’s Personal Data Privacy and Security Act of 2014, and detailed some of the bill's criminal proposals, including numerous amendments to the Computer Fraud and Abuse Act. The bill also included a proposed criminal statute that would read,
Whoever, having knowledge of a security breach and of the fact that notice of such security breach is required under title II of the Personal Data Privacy and Security Act of 2014, intentionally and willfully conceals the fact of such security breach, shall, in the event that such security breach results in economic harm to any individual in the amount of $1,000 or more, be fined under this tile [sic] or imprisoned for not more than 5 years, or both. 
Last Tuesday, ahead of a Senate Judiciary Committee hearing addressing the Target and Neiman Marcus data breaches, Senator Richard Blumenthal and Senator Ed Markey introduced the Personal Data Protection and Breach Accountability Act of 2014. According to a recent press release, Senator Blumenthal stated that the bill “will give consumers much stronger, industry-wide protections against massive thefts of private financial information” and that “[s]tiffer enforcement with stringent penalties are vital to assure that retailers use state of the art safeguards.” Similar to Senator Leahy’s bill, the Personal Data Protection and Breach Accountability Act of 2014 would include a new criminal statute that would read, 
Whoever, having knowledge of a security breach and of the fact that notice of such security breach is required under title II of the Personal Data Protection and Breach Accountability Act of 2014, intentionally or willfully conceals the fact of such security breach and which breach, shall, in the event that such security breach results in economic harm or substantial emotional distress to 1 or more persons, shall be fined under this title or imprisoned not more than 5 years, or both.
A notable difference between these two proposals is the Personal Data Protection and Breach Accountability Act’s requirement that the breach “results in economic harm or substantial emotional distress to 1 or more persons.” In my eyes, this would encompass significantly more security breaches than in Senator Leahy's already broad proposal.

In a recent op-ed for the International Association of Privacy Professionals’ online publication, Privacy Perspectives, I question whether criminal liability for failing to disclose a data security breach would be a prudent move, focusing specifically on Senator Leahy’s bill. My concerns would extend to this new proposal as well.

It will be interesting to see, with such an outcry for a federal response, what (if anything) will be adopted, and whether some variation of these "criminal concealment of a known security breach" proposals will be included.

Thursday, February 6, 2014

Quick note: Oklahoma Appellate Court: No reasonable expectation of privacy in text messages sent to another person's phone

The case is State v. Marcum, No. S-2012-976 (OK App. Ct. Jan. 28, 2014). A summary of the case (with slight modifications):
A search warrant was used to obtain [cell phone text records]. [The court held] that [t]here is no reasonable expectation of privacy in the records of another person’s account <even if those records pertain to you>. When you hit “send” and [transmit a] text message to another person, you’ve lost any reasonable expectation of privacy in the message.
The court's holding can be boiled down to the following paragraph:
Addressing only the narrow question before us, Marcum has not demonstrated a reasonable expectation of privacy in the records seized from U.S. Cellular for Miller's phone account. This Court adopts the reasoning of the courts which have concluded that there is no expectation of privacy in the text messages or account records of another person, where the defendant has no possessory interest in the cell phone in question, and particularly where, as here, the actual warrant is directed to a third party. 
The court frames it as a novel issue, but I think the third-party doctrine compels a rather straightforward outcome.

Wis. Sup. Ct. 4th Amendment case: if a probation condition bans possessing a PC (i.e. contraband), you can seize it, but can you search it?

The Wisconsin Supreme Court heard oral arguments today in State v. Purtell, 2012AP001307-CR (Wis. Sup. Ct. 2014) (link to PDF of docket). A summary from the Wisconsin Public Defender's "On Point" site gives a good synopsis:
Purtell was on probation for animal cruelty convictions, and as a condition of probation was allowed access to computers only for school or work. After Purtell admitted having a laptop at home, his agent went to his home and removed the laptop. She found files showing females, some appearing to be very young, engaged in sexual acts with animals; after a warrant to search the computer was obtained based on that information, police found child pornography. The sole issue on appeal was whether the agent had reasonable suspicion to search Purtell’s computer for “contraband,” which the state argued included images of animal cruelty. The court of appeals held there was no reasonable suspicion, first because Purtell’s conditions of probation didn’t expressly prohibit him from possessing such images, and, second, because the state pointed to no reasonable grounds to believe there was some other kind of contraband on the laptop, but relied only on “generally suspicious” behavior. 
The oral argument can be found in the Wisconsin Supreme Court's oral argument archive, or click here for a direct link to the streaming wma file.

The Supreme Court's summary of the case can be found in its February oral argument preview. The State (Petitioner) frames the issue in this manner:
The content of Purtell’s computer, like the computer itself, was contraband regardless of whether Purtell’s probation included a rule or condition prohibiting the possession of images depicting cruelty to animals.
Purtell (Respondent) frames it like this:
The Images Retrieved from Mr. Purtell’s Computer Were Inadmissible Because the Probation Agent Did Not Have Reasonable Grounds to Believe the Computer Contained Contraband. 
Courts must separately analyze the reasonableness of a search for a computer and a search of the contents of a computer.
The State's Reply Brief can be found here.

The appellate court, which reversed the trial court and remanded, gives the following background synopsis:
Purtell was convicted of two counts of cruelty to animals, one resulting in the death of the animal, and he was placed on probation.  One condition of Purtell’s probation was that he not own or possess a computer and that he could only use a computer “at his place of business or school.”  The purpose of this prohibition may have been to limit Purtell’s access to certain types of images, but the conditions of his probation did not actually impose a limitation on the types of images or written materials Purtell could possess. 
At a meeting with his probation agent, Purtell complained about the no-computer condition.  Purtell told the agent that he had a working laptop and a desktop computer that did not work, both at his residence.  Purtell also told the agent that he had a Myspace account and gave the agent his Myspace password. 
For reasons that do not matter for purposes of this appeal, Purtell’s agent subsequently went to Purtell’s residence and removed his laptop and desktop computers.  The seizure of Purtell’s computers is not challenged.  Later, at her office, the agent looked at the contents of one of Purtell’s computers. The agent “clicked on files” and observed that titles of the files did not always match the images that were in the files.  The agent located files showing females engaged in sexual acts with animals.  The agent later testified:  “[A] number of the files, when we opened them, had names of like very young females.  [And there was] concern at some point that this was sex involving underage females.” 
Based on information that Purtell’s agent gained from looking at the contents of Purtell’s computers, law enforcement subsequently obtained a warrant to search the computers. The resulting further search revealed a large volume of still images and “videos” depicting young children engaged in sex acts.   
Purtell was charged with eight counts of possession of child pornography.  He moved to suppress the evidence resulting from the search of his computers, arguing that his probation agent performed an illegal warrantless search.  At a hearing on this suppression motion, Purtell’s probation agent testified that, prior to searching the contents of one of Purtell’s computers at her office, she looked at Purtell’s Myspace account.  On that account, she saw pictures of “animals that were partially human,” such as a “woman that was half woman and half a cow.”  The agent testified that, based on what she saw on Purtell’s Myspace account, she thought Purtell’s computers might have “files regarding cruelty to animals or death and mutilation of animals.”  She was concerned about Purtell’s mental health issues. 
After hearing testimony and viewing evidence, the circuit court denied Purtell’s suppression motion.  The court concluded that the agent had “legitimate reasons of probation supervision to view the [contents of the] computers.”  The court stated that the images the agent saw on Purtell’s Myspace account gave the agent reason to believe that there was contraband on Purtell’s computers.
The substance of the appellate court's decision:
As Purtell makes clear, he does not challenge the search of his residence or the seizure of his computers.  Rather, he challenges the search of the contents of his computers.  Indeed, the State and Purtell agree that the issue here is whether Purtell’s probation agent had “reasonable grounds” to believe that Purtell’s computers contained “contraband.”  The parties further agree that “contraband,” for purposes of this case, means any item that Purtell was not allowed to possess under the conditions of his supervision or any item whose possession is forbidden by law. 
So far as we can tell, the State’s sole argument on appeal is that, based on several pieces of information, Purtell’s probation agent had reasonable grounds to believe that Purtell’s computers contained images depicting cruelty to animals or the mutilation of animals, and that such images were “contraband.” However, even if we were persuaded that there were reasonable grounds to believe that Purtell’s computers contained images depicting cruelty to animals or the mutilation of animals, the State fails to demonstrate that such images are “contraband.”

…before this court and the circuit court, the State simply pointed to behavior that was generally suspicious, such as the fact that Purtell possessed the computers at home in violation of the conditions of his probation and Purtell’s failure to attend a scheduled mental health treatment appointment.  These and other factors may have justified the probation agent taking some action, but they do not supply “reasonable grounds” to believe that Purtell’s computers contained contraband.  As we have explained, the State’s argument in this regard appears to be based on the faulty assumption that Purtell’s probation conditions prohibited him from possessing images depicting cruelty to animals or the mutilation of animals.  Having rejected that assumption, the State’s arguments leave us with no basis to affirm the circuit court’s denial of Purtell’s suppression motion.  

Tuesday, February 4, 2014

Massive round-up of new law articles, covering privacy, Fourth Amendment, GPS, cell site, cybercrime, big data, revenge porn, drones, and more

This Article examines a question that has become increasingly important in the emerging surveillance society: Should the law treat information as private even though others know about it? This is the third-party privacy problem. Part II explores two competing conceptions of privacy — the binary and contextual conceptions. Part III describes two features of the emerging surveillance society that should change the way we address the third-party privacy problem. One feature, “surveillance on demand,” results from exponential increases in data collection and aggregation. The other feature, “uploaded lives,” reflects a revolution in the type and amount of information that we share digitally. Part IV argues that the binary conception cannot protect privacy in the surveillance society because it fails to account for the new realities of surveillance on demand and uploaded lives. Finally, Part V illustrates how courts and legislators can implement the contextual conception to deal with two emerging surveillance society problems — facial recognition technology and geolocation data.

Privacy laws rely on the unexamined assumption that the collection of data is not speech. That assumption is incorrect. Privacy scholars, recognizing an imminent clash between this long-held assumption and First Amendment protections of information, argue that data is different from the sort of speech the Constitution intended to protect. But they fail to articulate a meaningful distinction between data and other more traditional forms of expression. Meanwhile, First Amendment scholars have not paid sufficient attention to new technologies that automatically capture data. These technologies reopen challenging questions about what “speech” is. 
This Article makes two overdue contributions to the First Amendment literature. First, it argues that when the scope of First Amendment coverage is ambiguous, courts should analyze the government’s motive for regulating. Second, it highlights and strengthens the strands of First Amendment theory that protect the right to create knowledge. Whenever the state regulates in order to interfere with the creation of knowledge, that regulation should draw First Amendment scrutiny. 
In combination, these claims show clearly why data must receive First Amendment protection. When the collection or distribution of data troubles lawmakers, it does so because data has the potential to inform and to inspire new opinions. Data privacy laws regulate minds, not technology. Thus, for all practical purposes, and in every context relevant to privacy debates, data is speech.
The police tend to think that those who evade surveillance are criminals. Yet the evasion may only be a protest against the surveillance itself. Faced with the growing surveillance capacities of the government, some people object. They buy “burners” (prepaid phones) or “freedom phones” from Asia that have had all tracking devices removed, or they hide their smartphones in ad hoc Faraday cages that block their signals. They use Tor to surf the internet. They identify tracking devices with GPS detectors. They avoid credit cards and choose cash, prepaid debit cards, or bitcoins. They burn their garbage. At the extreme end, some “live off the grid” and cut off all contact with the modern world. 
These are all examples of what I call privacy protests: actions individuals take to block or to thwart government surveillance for reasons unrelated to criminal wrongdoing. Those engaged in privacy protests do so primarily because they object to the presence of perceived or potential government surveillance in their lives. How do we tell the difference between privacy protests and criminal evasions, and why does it matter? Surprisingly scant attention has been given to these questions, in part because Fourth Amendment law makes little distinction between ordinary criminal evasions and privacy protests. This Article discusses the importance of these ordinary acts of resistance, their place in constitutional criminal procedure, and their potential social value in the struggle over the meaning of privacy.
Conor M. Reardon, Cell Phones, Police Recording, and the Intersection of the First and Fourth Amendments, 63 Duke L.J. 735 (2013). Abstract:
In a recent spate of highly publicized incidents, citizens have used cell phones equipped with video cameras to record violent arrests. Oftentimes they post their recordings on the Internet for public examination. As the courts have recognized, this behavior lies close to the heart of the First Amendment. 
But the Constitution imperfectly protects this new form of government monitoring. Fourth Amendment doctrine generally permits the warrantless seizure of cell phones used to record violent arrests, on the theory that the recording contains evidence of a crime. The Fourth Amendment inquiry does not evaluate a seizing officer’s state of mind, permitting an official to seize a video for the very purpose of suppressing its contents. Moreover, Supreme Court precedent is typically read to ignore First Amendment interests implicated by searches and seizures. 
This result is perverse. Courts evaluating these seizures should stop to recall the Fourth Amendment’s origins as a procedural safeguard for expressive interests. They should remember, too, the Supreme Court’s jurisprudence surrounding seizures of obscene materials—an area in which the Court carefully shaped Fourth Amendment doctrine to protect First Amendment values. Otherwise reasonable seizures can become unreasonable when they threaten free expression, and seizures of cell phones used to record violent arrests are of that stripe. Courts should therefore disallow this breed of seizure, trusting the political branches to craft a substitute procedure that will protect law-enforcement interests without doing violence to First Amendment freedoms.
Elizabeth Friedler, Protecting the Innocent—the Need to Adapt Federal Asset Forfeiture Laws to Protect the Interests of Third Parties in Digital Asset Seizures, 32 Cardozo Arts & Ent. L.J., Issue 1 (2013).

Jana Sutton, Of Information, Trust, and Ice Cream: A Recipe for a Different Perspective on the Privacy of Health Information, 55 Ariz. L. Rev. 1171 (2013). Abstract:
The concept of privacy is inescapable in modern society. As technology develops rapidly and online connections become an integral part of our daily routines, the lines between what may or may not be acceptable continue to blur. Individual autonomy is important. We cannot, however, allow it to suffocate the advancement of technology in such vital areas as public health. Although this Note cannot lay out detailed instructions to balance the desire for autonomy and the benefits of free information, it attempts to provide some perspective on whether we are anywhere close to striking the right balance. When the benefits of health information technology are so glaring, and yet its progress has been so stifled, perhaps we have placed far too much value—at least in the health care context—on individual privacy.
Kevin S. Bankston & Ashkan Soltani, Tiny Constables and the Cost of Surveillance: Making Cents Out of United States v. Jones, 123 YALE L.J. ONLINE 335 (2014). Abstract:
In United States v. Jones, five Supreme Court Justices wrote that government surveillance of one’s public movements for twenty-eight days using a GPS device violated a reasonable expectation of privacy and constituted a Fourth Amendment search. Unfortunately, they didn’t provide a clear and administrable rule that could be applied in other government surveillance cases. In this Essay, Kevin Bankston and Ashkan Soltani draw together threads from the Jones concurrences and existing legal scholarship and combine them with data about the costs of different location tracking techniques to articulate a cost-based conception of the expectation of privacy that both supports and is supported by the concurring opinions in Jones.
Michael N. Schmitt & Liis Vihul, The International Law of Attribution During Proxy 'Wars' in Cyberspace, 1 Fletcher Security Review (forthcoming 2014). Abstract:
The article examines the use of non-State actors by States to conduct cyber operations against other States. In doing so, it examines attribution of a non-State actor's cyber operations to a State pursuant to the law of State responsibility, attribution of a non-State actor's cyber armed attack to a State for the purposes of a self-defense analysis, and attribution of cyber military operations to a State in the context of determining whether an international armed conflict has been initiated. These three very different legal inquiries are often confused with each other. The article seeks to deconstruct the issue of attribution into its various normative components.
Kate Crawford & Jason Schultz, Big Data and Due Process: Toward a Framework to Redress Predictive Privacy Harms, 55 B.C. L. Rev. 93 (2014). Abstract:
The rise of “Big Data” analytics in the private sector poses new challenges for privacy advocates. Through its reliance on existing data and predictive analysis to create detailed individual profiles, Big Data has exploded the scope of personally identifiable information (“PII”). It has also effectively marginalized regulatory schema by evading current privacy protections with its novel methodology. Furthermore, poor execution of Big Data methodology may create additional harms by rendering inaccurate profiles that nonetheless impact an individual’s life and livelihood. To respond to Big Data’s evolving practices, this Article examines several existing privacy regimes and explains why these approaches inadequately address current Big Data challenges. This Article then proposes a new approach to mitigating predictive privacy harms—that of a right to procedural data due process. Although current privacy regimes offer limited nominal due process-like mechanisms, a more rigorous framework is needed to address their shortcomings. By examining due process’s role in the Anglo-American legal system and building on previous scholarship about due process for public administrative computer systems, this Article argues that individuals affected by Big Data should have similar rights to those in the legal system with respect to how their personal data is used in such adjudications. Using these principles, this Article analogizes a system of regulation that would provide such rights against private Big Data actors.
Paul J. Larkin, 'Revenge Porn,' State Law, and Free Speech (Jan. 14, 2014). Abstract:
For most of our history, only celebrities — presidents, movie stars, professional athletes, and the like — were at risk of having their everyday exploits and activities photographed and shown to the world. But that day is gone. Today, we all face the risk of being made into a celebrity due to the ubiquity of camera-equipped cell phones and the ease of uploading photographs or videos onto the Internet. But a particularly troubling aspect of this phenomenon goes by the name of "revenge porn" — that is, the Internet posting of photographs of naked former wives and girlfriends, sometimes in intimate positions or activities. Revenge porn is an example of malicious conduct that injures the welfare of someone who mistakenly trusted an intimate partner. Tort law traditionally has allowed parties to recover damages for such violations of privacy, and criminal law also can prohibit such conduct, but there are several First Amendment defenses that the responsible parties can assert to fend off liability. This article argues that allowing a victim of revenge porn to recover damages for publication that breaches an implicit promise of confidentiality is faithful to tort and criminal law principles and will not punish or chill the legitimate expression of free speech.
Jonathan Olivito, Beyond the Fourth Amendment: Limiting Drone Surveillance Through the Constitutional Right to Informational Privacy, 74 Ohio St. L.J. 669 (2013). 

The entirety of Volume 74, Issue 6 in the Ohio State Law Journal; Symposium: The Second Wave of Global Privacy Protection (Titles Below)
Peter Swire, The Second Wave of Global Privacy Protection: Symposium Introduction, 74 Ohio St. L.J. 841 (2013). 
Ann Bartow, Privacy Laws and Privacy Levers: Online Surveillance Versus Economic Development in the People’s Republic of China, 74 Ohio St. L.J. 853 (2013). 
Andrew Clearwater & J. Trevor Hughes, In the Beginning . . . An Early History of the Privacy Profession, 74 Ohio St. L.J. 897 (2013). 
Claudia Diaz, Omer Tene & Seda Gürses, Hero or Villain: The Data Controller in Privacy Law and Technologies, 74 Ohio St. L.J. 923 (2013). 
A. Michael Froomkin, “PETs Must Be on a Leash”: How U.S. Law (and Industry Practice) Often Undermines and Even Forbids Valuable Privacy Enhancing Technology, 74 Ohio St. L.J. 965 (2013). 
Woodrow Hartzog, Social Data, 74 Ohio St. L.J. 995 (2013). 
Dennis D. Hirsch, In Search of the Holy Grail: Achieving Global Privacy Rules Through Sector-Based Codes of Conduct, 74 Ohio St. L.J. 1029 (2013). 
Gus Hosein & Caroline Wilson Palow, Modern Safeguards for Modern Surveillance: An Analysis of Innovations in Communications Surveillance Techniques, 74 Ohio St. L.J. 1071 (2013). 
Anil Kalhan, Immigration Policing and Federalism Through the Lens of Technology, Surveillance, and Privacy, 74 Ohio St. L.J. 1105 (2013). 
Bartosz M. Marcinkowski, Privacy Paradox(es): In Search of a Transatlantic Data Protection Standard, 74 Ohio St. L.J. 1167 (2013). 
Thomas Margoni & Mark Perry, Deep Pockets, Packets, and Harbors, 74 Ohio St. L.J. 1195 (2013). 
Omer Tene, Privacy Law’s Midlife Crisis: A Critical Assessment of the Second Wave of Global Privacy Laws, 74 Ohio St. L.J. 1217 (2013). 
Yofi Tirosh & Michael Birnhack, Naked in Front of the Machine: Does Airport Scanning Violate Privacy?, 74 Ohio St. L.J. 1263 (2013). 
Yang Wang, Pedro Giovanni Leon, Xiaoxuan Chen, Saranga Komanduri, Gregory Norcie, Kevin Scott, Alessandro Acquisti, Lorrie Faith Cranor & Norman Sadeh, From Facebook Regrets to Facebook Privacy Nudges, 74 Ohio St. L.J. 1307 (2013). 
Tal Z. Zarsky & Norberto Nuno Gomes de Andrade, Regulating Electronic Identity Intermediaries: The “Soft eID” Conundrum, 74 Ohio St. L.J. 1335 (2013).
The entirety of Volume 14, Issue 1 of the Journal of High Technology Law (2014) (Titles Below).
Lon A. Berk, After Jones, the Deluge: The Fourth Amendment's Treatment of Information, Big Data and the Cloud, 14 J. High Tech. L. 1 (2014). 
Jordan M. Blanke, The Legislative Response to Employers' Requests for Password Disclosure, 14 J. High Tech. L. 42 (2014). 
Edwin Batista, A Shot in the Dark: An Analysis of the SEC's Response to the Rise of Dark Pools, 14 J. High Tech. L. 83 (2014). 
Lloyd Chebaclo, Privacy Protections Left Wanting: Looking at Doctrine and Safeguards on Law Enforcement's Use of GPS Tracking and Cell Phone Records with a Focus on Massachusetts, 14 J. High Tech. L. 120 (2014).