Feinstein versus the CIA: Incident reveals hypocrisy and incompetence

What do you call it when someone rifles through your computer files without your knowledge? Employers call it monitoring, and civil libertarians call it surveillance. U.S. Senator Dianne Feinstein calls it a “separation of powers” issue.

So Feinstein described her headline-grabbing dustup last year with the Central Intelligence Agency, in which the agency spied on staffers of the Senate Intelligence Committee while the committee was investigating certain interrogations by the CIA.

The squabble started after the CIA set up a dark network in an out-of-the-way location for Senate investigators to view specified CIA documents. When the agency realized it had not sufficiently blocked access to other files that should have remained hidden, five CIA staffers tried to assess the damage by snooping through Intelligence Committee computer files, and even reading email messages. Feinstein went public when she found out, blasting the agency for “violating separation of powers.”

While the senator rooted her fury in the CIA’s apparent intention to undermine her investigation, she discussed the incident in the language of inter-branch conflict, rather than call it what it was — spying by a spy agency. This suggests she stands by her vilification of Edward Snowden and her defense of the programs he leaked, while claiming that her own digital affairs should be off-limits to the prying eyes of alphabet soup agencies. In other words, Constitutional principles apply differently to powerful elected officials than to regular citizens.

Feinstein repeated the separation-of-powers charge when the story resurfaced this week, after a CIA investigative panel cleared the agency’s five snoops of wrongdoing. The agency’s internal report found its employees’ hacking activities had been “clearly inappropriate,” but were not cause for discipline.

But the report revealed something else: very basic failures of information governance that enabled the Senate committee to grab documents it shouldn’t have, and that subsequently justified clearing the CIA of bad faith or malfeasance.

The Washington Post reports: “… the accountability review board concluded that the CIA-Senate arrangement was so convoluted that the panel could find no clear rules on how the shared computer system was to be run, let alone whether any rules had been violated.”

Notwithstanding the novelty of the “arrangement,” note that the world’s most powerful spy agency and the nation’s most powerful legislative body abandoned well-established standards that would have determined in advance the protocol for a digital data transaction. That kind of planning is fundamental to digital security and privacy.

Contemplating the agency’s sloppy information governance, especially given what was at stake, should raise serious doubts about extending to any federal agency more authority to collect, store, or probe the digital records of Americans. For anyone who’s still not convinced that the federal government is a poor steward of information, read the Post story linked above carefully.

That brings us (briefly) to President Obama’s 2015 cybersecurity initiatives. The president’s package would facilitate continued government access to citizen communications, and would snag security researchers, journalists, lawyers, and others in a net cast much too wide for cybercriminals. A recent surge in global terrorist activities validates the need for strong cyberdefense, but does not justify tossing all the cybercrime and national security concerns into a cybersecurity blender and turning on the blades.

It’s only a proposal right now, but there are elements that already cry out for the kind of definition that was lacking from the Senate-CIA plan. Selecting a bullet point at random:

“All monitoring, collection, use, retention, and sharing of information are limited to protecting against cybersecurity threats…”

Well, one would hope. But what does the protection entail, and how do we define the threats? These are not nit-picky questions; they are essential to information governance that does the intended tasks and mitigates privacy invasion. Or whatever you prefer to call it.


The privacy apartheid: no money, no time, and no education add up to no privacy

When privacy was a natural state of affairs, protecting it required a set of window shades, and maybe a hedge between you and the neighbor.

Modern privacy is a commodity, and the price is staggering. It’s not just money. Privacy protection is really inconvenient now, and intellectually challenging – not always in a good way. It requires a combination of education, time, prosperity, and technical aptitude that’s rare in a single human being. If you’re deficient in two or more of those categories, welcome to the privacy apartheid. You’re the have-not.

Here’s a brief and partial survey of the cost of privacy.

On the practically free end, you can make sure your internet browsers run in SSL mode by default. The cost is a bit of time. It requires a Google search and the ability to download a plugin. You also need to understand when and why this will preclude internet access in certain circumstances, so you won’t freak out when it happens.
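As a concrete illustration, here is a minimal sketch in Python of the check such a plugin automates: whether a plain-HTTP request to a site gets upgraded to HTTPS. It assumes the third-party requests library is installed, and the hostname is only a placeholder.

    # Minimal sketch: does a plain http:// request end up on an https:// page?
    # Assumes the third-party "requests" package is installed (pip install requests).
    import requests

    def upgrades_to_https(hostname):
        """Return True if the site redirects plain HTTP traffic to HTTPS."""
        response = requests.get("http://" + hostname, allow_redirects=True, timeout=10)
        return response.url.startswith("https://")

    if __name__ == "__main__":
        # Placeholder hostname; substitute a site you actually visit.
        print(upgrades_to_https("example.com"))

Roughly speaking, an SSL-by-default plugin performs that upgrade on every page load, before the request ever leaves your machine.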

If you’re very serious and skilled, you can run your own email server. Hardcore privacy advocates recommend this as if it were a walk in the park. It requires equipment, dozens of hours to implement, and a great deal of technical aptitude to maintain, with numerous headaches guaranteed.

Recently, consumer-grade privacy-enhancing products have become available. Check out the Consumer Electronics Show, where for many years, gee-whiz products with the greatest privacy-invading potential have been the highlight. This year, a tiny space on the show floor features vendors of privacy, starting with signal-blocking cases for mobile devices ($69-$199, no technical proficiency required). Last Private Place featured a competing product for phones a few months back, the $80 Privacy Case.

Vysk is there, with a phone sleeve that has an encryption feature in the microphone to keep conversations private ($229). Using Vysk’s QS1 case requires a technical comfort level sufficient to activate the product’s privacy modes and download the subscription-based apps. Add a monthly charge of $9.99, plus at least an hour to experiment and understand settings and capabilities.

Virtual Private Network service provider PIA offers encrypted internet access on demand, with tech support (annual package price: $39.95). But it’s not simple. You need basic knowledge of how networks function and probably about three hours to understand and activate the service.

Privacy advocates also suggest staying off most social media. There’s always been a privacy bonus for avoiding gossipy neighbors, and there still is. But there’s also a professional penalty, because those gossips now call themselves LinkedIn and Facebook, and your competitors are present in droves. Opportunity cost: incalculable.

If you’re looking for cheap solutions, a hat and sunglasses may provide a defense against surveillance cameras. For a few bucks and a few minutes you can sew a couple of infrared LED lights into the hat, with a 9-volt battery to power them. Hacker lore says that in certain lighting conditions, this will obscure your face from the cameras.
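For the tinkerers, the only math involved is Ohm’s law. Here is a rough sketch of the series-resistor calculation; the forward voltage and current figures are assumptions for a generic infrared LED, not specs for any particular part.

    # Rough series-resistor math for a 9-volt battery driving infrared LEDs.
    # The LED forward voltage and current below are assumed typical values;
    # check the datasheet of the part you actually buy.
    SUPPLY_VOLTAGE = 9.0        # volts, from the 9-volt battery
    LED_FORWARD_VOLTAGE = 1.5   # volts per IR LED (assumption)
    LED_CURRENT = 0.05          # amps (50 mA, assumption)
    LEDS_IN_SERIES = 2

    # Ohm's law: the resistor must drop whatever voltage the LEDs don't.
    voltage_across_resistor = SUPPLY_VOLTAGE - LEDS_IN_SERIES * LED_FORWARD_VOLTAGE
    resistor_ohms = voltage_across_resistor / LED_CURRENT
    resistor_watts = voltage_across_resistor * LED_CURRENT

    print("Series resistor: about %.0f ohms, dissipating %.2f watts"
          % (resistor_ohms, resistor_watts))

With those assumed numbers it works out to roughly 120 ohms, which is why the whole rig costs a few bucks.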

So you pay with money or you pay with time and know-how, or you pay with isolation. And no matter how you pay, no product exists to completely eradicate the ubiquitous privacy challenges that show up daily disguised as fun and convenience.

More embarrassing than a bad movie: Sony-shaming and Korea blaming

Troubling as it was to see the President of the United States lob questionable charges at a rogue regime, threaten a proportional response – whatever that means – and then jet off to Hawaii for a two-week holiday, it’s more troubling that, in the face of evidence to the contrary, the FBI stands by its speedy assessment that the North Koreans are to blame for the Sony attack.

Attribution is the most difficult task in analyzing cybercrime. Truly adept cybercriminals write purposeful miscues into their code to implicate others, and anyone can buy code that’s been used by someone else. There’s a virtual Wal-Mart for cybercriminals out there on the dark web. Right next door, there’s another storehouse of used malware code, free for the taking. And another, and another.

This week, a security research firm that’s studied the Sony breach since its announcement tossed cold water on the FBI’s North Korea theory. Some of the most respected names in the information security realm agree that the feds are barking up the wrong tree. Many of these experts doubted the state-actor theory even as President Obama made his pre-Christmas announcement blaming North Korea.

Three notable concerns arise from the President’s premature saber-rattling, and from federal insistence on pursuing the course based on deficient intelligence.

The first is obvious. Threatening North Korea for hostilities that can’t be reliably attributed to it could become another embarrassing moment in United States international relations. Moreover, if Kim Jong Un’s regime has the skill and resources to pull off the Sony attack, such saber-rattling could be dangerous. If the leader of the free world believes an adversary to be capable, unpredictable, and on the warpath, why poke it in the eye and then head for the beach?

Second, why the rush to judgment? Stuxnet chronicler Kim Zetter has documented that the initial messages to Sony from its attackers were demands for money, and had nothing to do with Sony’s movie featuring the assassination of Kim Jong Un. Not until December 8, after the media had repeatedly linked the attack to some North Korean grumbling about the film dating back to last summer, did the attackers make their first public reference to it. In other words, the North Korea narrative may be a media invention, and may have clouded the motive of the real culprits, who appear to have wanted ransom, not censorship.

Note also that it wasn’t until someone threatened to blow up movie theaters, likely some capitalism-hating jackass of the variety that boasts about penetrating the networks of big companies for laughs, that the White House ventured an opinion. A cynic might suggest that a threat to movie theaters on Christmas Day is an opportunity to promote federal police power unlike any since the days after September 11.

Americans are gaining a rapid understanding of cybercrime, but it’s still murky to many. Some of these Americans, consumed by their own daily obligations, followed the story just closely enough to cite the insulting email remarks about Angelina Jolie, but did not grasp the dire effects of the attack on Sony. Some of them, on a recent radio talk show, offered the theory that Sony hacked itself to generate publicity for its movie. What a fertile field in which to sow fear.

Finally, and in some ways most disturbing, the President conflated a straightforward business decision involving risk management with a sin against the First Amendment. Obama had plenty of company in this regard, with Hollywood heavies like George Clooney leading the charge, and commentators shepherding the general public to join in such a Sony-shaming harangue that the troubled company felt compelled to reverse itself after pulling the film from its scheduled debut.

Of course, a foreign dictator cannot commit a First Amendment violation against U.S. citizens, but this became the mantra: Sony should not buckle to offshore censorship. (Like Facebook does?) Nobody defended Sony’s right to withhold speech, which is surely as embedded in the concept of free expression as is the right to disseminate ideas.

Sony, already smarting, and perhaps struggling for its very existence, was castigated for performing a fundamental risk assessment that measured the slight possibility of exploding movie theaters against the devastating consequences of such an occurrence, however unlikely. It was grossly unfair to paint Sony’s risk aversion as a slap in the face of the U.S. Constitution rather than a simple failure to devise a creative solution. Which it later did, under significant duress.

“I wish [the Sony executives] had called me,” the President said. No wonder they didn’t.

Five privacy measures the Nevada legislature should pass in 2015, Part I

1- Require public notice of license plate scanning, and set uniform standards for managing the data

Law enforcement agencies throughout Nevada are using license plate scanning technology. The scanners record every license plate number from every car they encounter, even at speeds up to 110 miles per hour, according to a proposal submitted by a vendor of car-mounted scanners.

Both local jurisdictions and the state have the devices. No uniform policies exist for storage and handling of the data, or for disclosure of records related to quantity of data captured, and whether it’s transmitted outside the state. These policies should be standardized across the state.

Citizens also have a right to know their vehicles are being tracked. The scans include GPS location data and other metadata that can be used to reconstruct activities and associations. Alerts to such technology should be posted in every jurisdiction where it’s used, on signs similar to those informing drivers that their speed is monitored by aircraft, and should include a reference for more information.
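To make the stakes concrete, here is a hypothetical sketch of what a single scan record might contain. The field names are illustrative, not taken from any vendor’s documentation; the point is that every row ties a plate to a time and a place.

    # Hypothetical license plate scan record; field names are illustrative only.
    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class PlateScan:
        plate: str            # the plate number read by the camera
        scanned_at: datetime  # when the read occurred
        latitude: float       # GPS fix of the scanner at that moment
        longitude: float
        camera_id: str        # which unit made the read
        agency: str           # which jurisdiction owns the data

    # A year of rows like this from one patrol car is enough to reconstruct
    # routines: home, work, church, clinic, and who parks nearby at the same times.
    example = PlateScan("ABC123", datetime(2014, 12, 1, 8, 15),
                        39.1638, -119.7674, "unit-42", "Example County Sheriff")
    print(example)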

2- Prohibit medical practices from scanning and storing driver’s licenses

Medical information is statistically among the most vulnerable to cybercrime and system breaches. The good news is, some insurance companies have apparently realized that scanning and storing driver’s licenses is a bad policy. The requirement is being loosened.

The bad news is that medical receptionists haven’t stopped demanding a copy the moment you enter the facility. You’re still forced to assert yourself if you want to keep your license out of their systems, perhaps even escalating the argument to the office manager. (Earlier this year, after losing one of these fights, I submitted questions in writing about the technical storage specifications and the rationale for retaining my license. Six weeks later I received a letter from the facility with the answer – “it’s not required and we’ll destroy the file if you’d like us to.”)

Why retain a document bearing a patient’s facial image and signature sample if you don’t have to? Why collect one more shred of personal information than necessary? This is an invitation to identity theft, and should be outlawed by statute.

3- Pass a resolution supporting federal reform of the Electronic Communications Privacy Act

NSA whistleblower Edward Snowden and Facebook founder Mark Zuckerberg were both toddlers when ECPA was written. That alone should convey the obsolescence of the federal law that controls government access to citizen communications, without further explanation of how the nature and volume of digital communication have changed, or of the expansive government tactics employed to pursue it.

It’s conceivable the lame-duck Congress could pass ECPA reform before the 2015 legislative session convenes in Nevada. There are proposals with bipartisan support in Washington, and there’s a lot of pressure to act from the tech sector and privacy advocates. But in the event ECPA reform sits until next year, the Nevada legislature could go on the record with a resolution supporting reform. The American Legislative Exchange Council has a model resolution that can serve as a starting point, even if it’s not adopted wholesale.

4- Create a legislative subcommittee on privacy

In 2014, privacy discussions revolve in large part around technology. But privacy is an element in virtually every aspect of personal, family, and business life. The state legislature inserts itself into all of these areas, mostly without contemplation of how its actions affect privacy.

A subcommittee should be formed to review the implications of the dozens of bills each session that affect privacy.

The committee should also sponsor a bill requiring all government agencies at every level to review their own privacy policies and report back to the committee on data collection and privacy protections, with the relevant results compiled and posted online for accessibility to the public. Nevadans deserve to know what data is collected, where and how it is stored, and the purpose for which it’s used.

5- Codify a set of data handling and retention policies to protect student privacy

Public school students are subject to unprecedented data collection regarding their health, academic performance, disciplinary issues, and even their home life. The old joke about things that end up on your “permanent record” isn’t a joke anymore; it’s a reality.

Data collection on K-12 students, to the extent it can be controlled at the state level, should be reviewed and analyzed to determine which information is truly useful and which might be considered extraneous.

Stringent data security and privacy policies should protect students from criminals and commercial activity, but also protect the adults they will become from being haunted by the kids they once were. That means a vigorous schedule of data destruction should be part of the policy. The data should not outlive its educational necessity, and the longer it’s retained, the greater the odds it will leak. Young people change as they mature. Damaging details about their past should not become one more obstacle for them to overcome when they’re trying to conduct their adult lives.
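As a sketch of what a vigorous destruction schedule means in practice, a retention rule can be written in a few lines. The record types and retention periods below are placeholders for illustration, not recommendations.

    # Illustrative retention check; record types and periods are placeholders,
    # not policy recommendations.
    from datetime import date, timedelta
    from typing import Optional

    RETENTION_YEARS = {
        "disciplinary": 1,   # placeholder: purge a year after creation
        "health": 3,         # placeholder
        "academic": 7,       # placeholder
    }

    def should_purge(record_type: str, created: date, today: Optional[date] = None) -> bool:
        """True if a record of this type has outlived its illustrative retention period."""
        today = today or date.today()
        years = RETENTION_YEARS.get(record_type)
        if years is None:
            # Unknown record types need a human decision, not silent retention.
            return False
        return created + timedelta(days=365 * years) < today

    print(should_purge("disciplinary", date(2012, 9, 1), today=date(2014, 12, 1)))  # True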

California has just passed two laws that bear discussion and analysis. They’re conceived to fill gaps in federal student privacy protections, and while they’re not completely in sync with the sentiments above, they demonstrate what control at the state level can look like. More about these statutes in a future post.

Nevada has no business mandating a remote kill switch for cellphones

The Silver State is still basking in praise uttered last month by Tesla founder Elon Musk, who dubbed Nevada a “get things done” state. What a shame it would be if we followed the business-friendly feats that persuaded Tesla to locate its battery plant here with legislation signaling the opposite attitude – like requiring cellphones sold in Nevada to have remote kill capability.

The 2015 Nevada legislature will contemplate the kill switch mandate as a way to curb smartphone theft, according to a USA Today report.

Why would a state striving to reinvent itself as a technology haven insert itself into design standards for a globally-distributed tech product? Surely not because Nevada wants to emulate California, where a new kill switch law goes into effect next July.

Perhaps the advocates anticipate the reliable political boost that comes from supporting a crime bill, even when the resulting law is largely symbolic. In this case, symbolic because Apple, Google and Microsoft, which together command nearly 100 percent of the smartphone market, are already equipping their devices with remote anti-theft features, or preparing to do so in their next versions. For the tiny remainder of the market, mostly Blackberry users, the remote-kill feature has existed for years.

Beyond the politics, when governments manipulate the architecture of communication equipment, privacy and civil rights implications must be considered. State-mandated remote access provides an opening for abuse, as demonstrated by federal backdoor requirements that paved the way for spying and cybercrime.

The new California law says service to a phone can be halted only by “an authorized user.” But guess who is “authorized” besides the owner of the phone? Government at any level, right down to the dog catcher. The California statute incorporates a Public Utilities Commission Code section allowing a “government entity” – broadly defined – to get a court order requiring the provider to cut service for a “reasonably necessary” period.

"Governmental entity" means every local government, including a city, 
county, a transit, joint powers, special, or other district, the state, 
and every agency, department, commission, board, bureau, or other 
political subdivision of the state, or any authorized agent thereof.

Based on this language, a bus supervisor or a fire inspector could conceivably be an authorized agent.

The circumstances allowing a government-ordered shutdown include probable cause that the phone is being used for an unlawful purpose, or that without intervention there is jeopardy to public health, safety, or welfare.

Civil libertarians are going on the record with warnings that peaceful protesters might be shut down, or that law enforcement might misuse the remote kill. And it’s hardly ridiculous to wonder what public welfare threats involving individual cellphones government might perceive.

Sections authorizing government use will be the consequential portion of any Nevada kill switch law, given the industry’s already clearly-stated intention to provide anti-theft technology. Observers of Nevada’s 2015 session should watch closely. Or, maybe Nevada’s stated goal of luring tech will prevail, and the legislature will spend its time on other matters.

Not without a fight: Keeping the cops out of your iPhone

Law enforcement has its boxer shorts in a bunch following Apple’s announcement that iPhone 6 and iOS 8 devices will be protected from the probing eyes of police. Google, too, will complicate criminal investigations by changing its optional Android encryption feature to automatic.

Federal officials are “bracing for a confrontation with Silicon Valley” over encryption-by-default, reports the Wall Street Journal. A former FBI official called the new privacy feature “outrageous,” and likened it to an invitation for criminals to use the products.

Lame-duck Attorney General Eric Holder piled on this week, denouncing encryption and other privacy tools to the Global Alliance Conference Against Child Sexual Abuse. Holder wants tech companies to provide back doors into their products. To save the children, of course, whose exploitation at the hands of perverts is second only to their exploitation by the political class.

Criminals are certain to use encrypted smartphones, but so are millions of law-abiding citizens who’d like to send private messages to their physicians, their business prospects, and their mistresses. Some of them might snap a nude selfie or two. Everyone has perfectly legal secrets to keep.
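For readers who want a feel for what encryption by default means, here is a toy sketch using Python’s third-party cryptography package. It is not Apple’s or Google’s actual scheme; it only shows the general idea that the key is derived from the owner’s passcode, so anyone without the passcode, vendor included, holds only ciphertext.

    # Toy sketch of passcode-derived encryption. NOT how iOS or Android actually
    # implement device encryption; it only illustrates the idea that the vendor
    # never holds the key. Requires "cryptography" (pip install cryptography).
    import base64
    import os

    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

    def key_from_passcode(passcode, salt):
        """Stretch a short passcode into a 32-byte encryption key."""
        kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                         salt=salt, iterations=200000)
        return base64.urlsafe_b64encode(kdf.derive(passcode.encode()))

    salt = os.urandom(16)                    # stored on the device, useless by itself
    key = key_from_passcode("123456", salt)  # derived only when the owner enters the passcode
    ciphertext = Fernet(key).encrypt(b"perfectly legal secrets")

    # Without the passcode there is no key, and without the key, only ciphertext.
    print(Fernet(key).decrypt(ciphertext))

The real implementations are far more elaborate and tie keys to the hardware as well, which is why the companies say they can no longer unlock devices for police.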

Bravo to Apple and Google for their late, if grudging, arrival at the privacy party. In this arena, it should be noted, Apple is following, not leading. The privacy community celebrated the release earlier this year of the encrypted Blackphone, an Android adaptation with amped-up security and privacy. Blackphone hails from Switzerland, where its founder, an American technology entrepreneur who champions privacy, located the company for precisely the reasons outlined above.

Law enforcement’s fight for continued easy access to devices, cloud storage, and business-grade communication tools compromises the American economy, as well as personal privacy. Global companies have reservations about buying products that are open to U.S. government fishing expeditions. Just one more reason for American innovators to move offshore.

And this squabble ignores the valuable role of encryption in fighting cybercrime. Both personal and business data need to be properly encrypted. The fight to secure your snapshots is one thing; guarding the embedded systems that control water and power facilities is the next level, as is safeguarding the intellectual property that’s the lifeblood of innovation. How about keeping criminals and terrorists out of the banking networks and the airport control towers?

Holder and company are apparently willing to expose a data pool that reveals far more than the rate at which your next-door neighbor consumes dirty movies. (Evidence of which, incredibly enough, is sometimes seized by prosecutors grasping for dubious rape convictions, in order to establish predatory intent.)

Police have many avenues to truly relevant evidence. Third-party billing records and court orders requiring legitimate suspects to turn over their encryption keys offer a slower, but better-considered route. More generally, the surveillance deck is stacked heavily in law enforcement’s favor.

It’s time to nix the notion that our privacy is disposable because there’s a guy waiting behind every bush to molest children and grab college girls, or that everyone who carries more than one cell phone must be a drug dealer. Such an allusion to multiple phones was made by none other than Chief Justice John Roberts, even as he was about to author this summer’s Supreme Court decision requiring warrants for cell phone searches.

It’s also time to hit back at the rhetorical last refuge of scoundrels. “What if it was your daughter?” – a wretched query hurled regularly at privacy advocates, as if only those unscathed by crime should appreciate principles that protect all citizens from intrusions into their personal affairs.

These tired crime-fighting tropes need a rest, and so does the assumption that every technological tool should be an automatic enhancement to police power. These are stops along a path that leads to fear and mistrust of the police. That’s not a desirable outcome for any of us, as we saw in Ferguson, Missouri.

Retro privacy concerns: bed pans and hospital gowns

I couldn’t see her face on the other side of the curtain, but I was pretty well acquainted with my roommate after a few hours. We were friends, kind-of, by the time the nurse helped me position myself over a bedpan. Propping myself up on three of my all-fours, because one foot is severely injured, I was momentarily struck with a rush of old-school emotion about privacy.

Hours earlier, three men had undressed me in the emergency room. There was no institutionally-prescribed mock concern for my emotional comfort. In short order, the male ER personnel had pulled off the pants, T-shirt, and sports bra I’d been wearing on my bike ride, and put me into a gown. No-nonsense, down-to-business, get-‘er-done. It didn’t feel strange because they didn’t make it strange.

So now, with a nurse and another patient in the room, the self-described privacy advocate needs to go potty.

“Suddenly, I’ve got a shy bladder,” I told the nurse, who said she’d leave the room for a moment to give me some privacy. It didn’t help, because it wasn’t about her.

Dozens of times a month, I contemplate twenty-first century privacy. Procedures, protections, violations, bad laws, and the dumb ways people expose themselves for the sake of convenience or socializing.

So often and so automatically do I think about these things that my brain’s been retrained. It translates “privacy” differently than it did when I was growing up. Knowing that people are watching in real time as my body functions, and looking at my private parts — once, this would have been a 10 on the privacy invasion scale. It barely registers now. Seems quaint, even.

Privacy invasion is made of different things now. Video surveillance and invasive authority. Businesses blithely collecting info without considering the consequences. Free services that steal your soul with a thumbs-up and a smile. The mantra that says you have nothing to worry about because you’ve done nothing wrong, which is, of course, patently false. And the big lie, repeatedly told and never questioned: “We take privacy very seriously.” Ha.

So consider the people who work in clinical settings, helping the ill and patching up the occasional injured bicyclist. Like the creeps who steal nude photos from the cloud, these medical professionals see lots of naked body parts in a given week. Like the social media moguls, they stay emotionally detached from their subjects, and they do what they do for money. Like mindless good-government types, they expose you in order to help you.

But their invasion is fleeting, it has an authentic purpose, and they’re in the room, looking you in the face, with full accountability while they do it. Those are some of the reasons it’s different. The reasons float through my head, and in the middle of the night I wake up and notice the lights on the patient cams are lit. It might be the pain medication they’ve pumped into me, but tonight I don’t care.