Nevadans seek state-mandated election audits

You’ll seldom hear a more vigorous defense of a state-run information system than the one mounted by election officials when voters challenge the legitimacy of an election. So it was earlier this week in the Nevada Assembly committee that vets election bills, where a group called the Citizen Task Force for Voters Rights showed up to promote AB209.

The bill would require the counties to establish an audit trail for each process involved in conducting an election. Voter registrars from across the state stepped up to protest the cost of implementing the measure, and to reassure lawmakers that their current practices are solid. Clark County’s Joe Gloria, as designated spokesman for his colleagues, touted their performance, noting that Nevada has received national recognition for election integrity.

The problem, says the task force, is that election departments are their own auditors. They investigate any reported irregularities, and not surprisingly, they find no fault in their own system. This wouldn’t fly for casinos or banks, and the task force wants Nevada’s elections subjected to external audits by fraud examiners, same as other high-stakes sectors.

Citizen Task Force for Voters Rights started as a group of voters seeking answers after a phantom candidate took 22.18 percent of the votes in a 2014 Republican primary contest. A man named Mike Monroe had captured 5,392 votes in Congressional District 4 without conducting a campaign. He had no financial backers, and never made appearances or walked neighborhoods. The group's search for Monroe turned up no registered voter who knew him or voted for him. His supposed address was a vacant building.

Monroe’s vote total was all the more astonishing because his two opponents, then-state legislator Crescent Hardy and Las Vegas activist Niger Innis, conducted energetic campaigns and generated significant press coverage. Typically, anemic candidates facing better-known names would capture between 2 and 7 percent, according to task force research.

Since that election, task force members say they’ve devoted hundreds of hours to investigating election procedures in the counties encompassed by CD 4. They’ve reviewed materials, interviewed people who’ve worked at the polls, and researched the ways elections can be compromised.

They’ve compiled a list of election system vulnerabilities starting with the absence of audit trails and chain of custody records. Add weak voting machine security, training deficiencies, insufficient background checks, and undisciplined transportation procedures. The list also includes “failure to create a security culture.”

Some of the task force claims have years’ worth of anecdotal support from observers and polling place workers.

Election managers are passionate about their work, and nobody suggests they don’t take their task seriously. In the days since the AB209 hearing, two election officials have offered informal assessments of Nevada’s election system security. One described it as “bulletproof” and the other supports the assertion that it’s impervious to criminal interference.

To a reporter who’s covered voting security issues for more than a decade, they seem to be in denial. It was somewhat understandable in 2004, when electronic information management was still evolving. In 2015, they appear willfully blind to reality. No system is bulletproof. Sony wasn’t bulletproof. Anthem Blue Cross, J.P. Morgan, and the U.S. Defense Department were not bulletproof. All of those entities spend millions more on security than budget-constrained Nevada election departments.

Consider also our reliance on minimally-trained election day volunteers, and the central role of the much-maligned Sequoia voting machines. It’s unnerving, even insulting, to expect intelligent taxpayers to believe that nothing can possibly go wrong.

Some lawmakers on the Assembly committee mirror the official demeanor, making it clear they favor blind reliance on the system over weighing thoughtful criticism from skeptical voters. Those legislators also reflect the tendency of election managers to blame questionable occurrences on the voters.

“Weird things happen (in primary elections),” said one Assemblyman, adding that primary voters are inclined to cast irrational votes.

The Citizen Task Force may struggle to get a second hearing.

Why Hillary’s State Department email and the Clark County School District email should have similar protection

Hillarymail, Part I: The data path to government computer networks should be secure

Nevadans should take special note of the revelations about former Secretary of State Hillary Clinton’s email account, which she reportedly managed from a server in her New York home while she was serving in the Obama Administration.

Mrs. Clinton is being criticized for three reasons, including her astonishing presumption that rules don’t apply to her. The other two reasons are pertinent to Nevada’s own unsettled questions about the difference between email content created by public employees, which should be part of the public record, and email addresses assigned to public employees, which should not.

The content of Hillary Clinton’s State Department email, in its entirety, should belong to the taxpayers. And it would, had she played by the rules. As it stands, we’ll never know if we’ve seen the complete archive. Her email address, on the other hand, should belong to the United States Federal Government on behalf of the taxpayers. And it would, had she played by the rules. Hillary’s email address should have existed behind a layered, military-grade security protocol. Would it be safe there from hostile activity? We can only hope, but that’s the intention.

Why does this distinction seem obvious in the face of national security implications, but not when the security of Nevada school children and their teachers is implicated?

The Clark County School District made the right call, with no apparent understanding of how right it was, when it denied public records requests for teacher email addresses. The district said that sharing the addresses with the Nevada Policy Research Institute (and other requesters) would cause “countless businesses and organizations to continuously solicit district teachers through their work email.” In other words, the district thought making the email addresses public would create a nuisance.

NPRI then sued for the email database. The district’s motion to dismiss the complaint didn’t go far enough, nor was it sufficiently precise in claiming that broad use of teacher emails by outsiders would “frustrate” the purpose of the district’s communications network.

“Teachers would be forced to spend time sorting through phishing scams, computer viruses, and other unsolicited spam email,” the district asserted, if “organizations like (NPRI), as well as internet marketing companies, hackers and anyone else who may benefit from thousands of active email accounts…” were given access.  The additional traffic would “clog the servers and the computer systems, harming the public in the process.”

The harm envisioned by the district was inconvenience and misspent time due to commercial targeting of teachers. District officials apparently did not grasp the potential for malicious penetration causing catastrophic system failure. Nor did they link “phishing scams” and “hackers” with harm to student privacy. We’ve since learned from a separate conflict over academic standards that Nevada’s school districts are creating extensive student dossiers containing hundreds of personal, non-academic data points. What potential harms might come from an incursion into those information troves?

Email addresses are a data path, leading first to people, then to systems. Hostile nations might have used Hillary Clinton’s data path to glean State Department secrets. The math teacher’s data path could offer access to a valuable bundle of assets held by the nation’s fifth largest school district. Criminals could find payroll records, stalk students, or blackmail parents and administrators. The threats to these systems are utterly analogous.

There is compelling state interest in protecting government information systems at all levels. There’s no outcry in Nevada suggesting that school teachers are unreachable by the people who need to reach them. Tight system security does not constitute lack of transparency. We’ll soon see if the Supreme Court of Nevada agrees.

Hillarymail, Part II: Content is public record

The primary relevance of “HDR at Clintonemail dot com,” aside from its eloquent expression of presumed privilege, is its deviation from national security standards. Any omissions from the public archive can be corrected with the efforts of a diligent press, or a congressional investigation, or a special prosecutor if it comes to that.

The great (and not-so-great) thing about email is that it multiplies like bunnies. Anyone who destroys official email will live to regret it. Somebody, somewhere, will have the means, motive, and opportunity to resurrect regrettable messages.

State Department email messages, school district email messages, and all other email messages on taxpayer-funded systems are public records, and should be turned over to the public, period.

Hillarymail, Part III: Privacy and the infuriating double standard

Of all people who should realize that public life brings a diminished expectation of privacy, you’d think Hillary Clinton would top the list. Time will tell if it’s Hillary who will validate the infamous utterance of Google Chief Eric Schmidt: “If you have something you don’t want anyone to know, maybe you shouldn’t be doing it.”

Hillarymail, whether a scandal or a screw-up, is a vivid reminder that Washington’s top tier has a double standard when it comes to privacy. They want theirs, but they’re willing and eager to be part of the data-sucking machine that robs you of yours.

It’s also a great chance for the taxpayers to demand that our privacy, not theirs, should be paramount. On that front, the silence so far is deafening.

Nevada Legislature needs a moratorium on data collection

It’s hard not to seem like a luddite, a naysayer, or a nut while assessing the new and expanded uses of technology proposed in the Nevada legislature.

And what a shame to feel uneasy, rather than appreciative, about a proposed DMV database that would help police locate family members quickly when someone is rushed to the hospital after an accident.

County coroners and the Department of Public Safety are supporting a bill to create an emergency contact registry. It would save money and man hours when they’re looking for next of kin. Who wouldn’t jump on board? Why apply the brakes to a plan that could help loved ones arrive in time to make critical medical decisions, or spare them agonizing hours wondering what’s happened to someone who’s unconscious or dead?

But privacy advocates identified holes in the provision of SB3 that describes the management of this personal data. They asked for tighter guidelines.

What could go wrong with this database? Maybe nothing. Depends on who’s minding the data, and how.

But suppose a criminal gets access, and calls next-of-kin to report a fake accident. He prods panicky relatives for personal information to get proper emergency care for the “victim” — insurance policy numbers, physician’s name, and prescription drug information. Family members comply, desperate to help. This is not far-fetched. Similar schemes are rampant, and profitable.

Take comfort in knowing the registry would be optional. Nevadans who love the idea more than they fear a security breach would opt in.

Carson City is awash in bills conceived to make life safer or more convenient by collecting more personal information, or by inducing Nevadans to engage with the state’s information systems. Many have useful goals, but also provide fertile ground for unintended consequences.

In another example, an election procedures bill would allow election departments to send sample ballots by email to voters who opt in. Voter registrars say it will save money mailing paper ballots, and political activists believe it will stimulate civic involvement.

Proponents submitted a “privacy amendment” that puts the burden on voters to submit written requests to keep their email addresses private. Rather than provide privacy by default, the state will require voters who opt in for email ballots to subsequently opt out of a public listing.

What could go wrong? Depends on the sophistication of the data custodians, the technical rigor of their system, and the savvy of the citizens.

Confoundingly, while making some airy statements that raise questions about current security on the Clark County election website, Voter Registrar Joe Gloria also testified that sample ballots are available online. So why not encourage voters to download an electronic version, rather than solicit email addresses?

All of this should give pause to lawmakers. Their confidence should be conditional on absolute clarity by the data collector. And every goal should be accomplished in the least intrusive manner.

But some members of the elections committee gushed over the sheer gee-whiz-we’re-digital factor. Others were no doubt persuaded by the cost savings. Clark County alone would save $1,670 for every thousand voters who choose email over paper mailing.

If you believe your state-sponsored data custodians have privacy and security locked down, recall that we recently saw the inadvertent exposure of social security numbers belonging to 114 retired judges by an entity with fiduciary responsibility. PERS, the Public Employee Retirement System, emailed a spreadsheet with unencrypted social security numbers in response to a public information request. The breach was reported by the recipient, the Nevada Policy Research Institute, which had sought the data for a study of pensions.

It’s a stunning mistake. Although no names accompanied the data, and the recipient behaved responsibly, things might have been worse. Identities can be reverse engineered using a couple of the other data points that appeared on the spreadsheet.
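To see how that reverse engineering works in principle, here is a toy sketch in Python (the column names and values are hypothetical, not the actual PERS spreadsheet): a record with no name attached can often be matched to a public roster by joining on quasi-identifiers such as retirement year and benefit amount.

import pandas as pd

# Hypothetical leaked spreadsheet: no names, but SSNs plus quasi-identifiers.
leaked = pd.DataFrame({
    "ssn": ["123-45-6789", "987-65-4321"],
    "retirement_year": [2009, 2012],
    "monthly_benefit": [6150.25, 7420.00],
})

# Hypothetical public roster of retired judges (names are public record).
roster = pd.DataFrame({
    "name": ["Judge A", "Judge B"],
    "retirement_year": [2009, 2012],
    "monthly_benefit": [6150.25, 7420.00],
})

# Joining on the quasi-identifiers re-attaches names to the "anonymous" SSNs.
reidentified = leaked.merge(roster, on=["retirement_year", "monthly_benefit"])
print(reidentified[["name", "ssn"]])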

In 2015, government and the private sector are both lagging in their grasp of how to protect privacy and security. There’s even less awareness of where potential danger might lie.

“Because we can” is not a good reason to expand data gathering by the state. Nevada might benefit from a two-year moratorium on such initiatives while public understanding catches up with technology.

Privacy Potpourri: Homeland Security inserts itself into local sex trade, and more

Privacy headlines popped in January like champagne corks on New Year’s Eve. Here are a few highlights, starting in Reno, where nine hapless SOBs were snagged by a law enforcement team including agents of the U.S. Department of Homeland Security, for attempting to purchase unspecified sexual services on the street.

Recall that the mission of Homeland Security was supposed to be preventing actual breaches of homeland security. The DHS website gives only the barest hint of the mission creep that has it preventing transactional sex between Reno street hookers and their prospective customers.

Here’s the department’s “vital mission.”

“…to secure the nation from the many threats we face. This requires the dedication of more than 240,000 employees in jobs that range from aviation and border security to emergency response, from cybersecurity analyst to chemical facility inspector. Our duties are wide-ranging, but our goal is clear – keeping America safe.”

The Sparks Crime Suppression Team, apparently finding no crime to suppress in its own city, was on hand to help Reno PD and the feds with the six-hour sting, as were workers from the Washoe County Health Department, who performed mandatory HIV tests. Florence Nightingale must be smiling in heaven.

The first weeks of 2015 also revealed that at least 50 American law enforcement agencies have been secretly using a hand-held radar device to perform surveillance of human activity inside of homes, despite a U.S. Supreme Court ruling requiring a warrant for similar searches that rely on thermal readers to detect heat behind the walls of buildings.

The radar devices were designed for military use, to spot human presence inside buildings by detecting movements as subtle as breathing, reports USA Today.

The 10th Circuit Court of Appeals upheld a search by U.S. Marshals using the device, but noted that it raises “many questions.”

Speaking of invasive technology with ostensibly benign intentions, the National Science Foundation is paying a professor $50,000 to develop a facial recognition app that monitors student attendance at college lectures. The developer teaches at Missouri University of Science and Technology. He uses his smartphone to take a video of students in the lecture hall. His finished product will automatically take attendance by applying a facial recognition algorithm to the video.

The rationale for this NSF investment is that attendance is the best predictor of graduation rates, and that students who don’t graduate are less able to pay off their student loan debt. No word on the controlling classroom policy when an adult student declines to have his image captured on his prof’s phone.
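For a rough sense of how such an app might work under the hood, here is a sketch using OpenCV face detection (detection is a simpler cousin of the recognition step the professor describes, and the file name and thresholds below are assumptions for illustration):

import cv2

# Sketch: count detected faces across frames of a lecture-hall video.
# Real attendance-taking would need recognition (matching each face to an
# enrolled student), not just detection.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture("lecture.mp4")  # hypothetical video file
max_faces = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    max_faces = max(max_faces, len(faces))
cap.release()

print(f"Estimated attendance (peak faces detected): {max_faces}")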

The private sector has a solution, too. The Class120 app costs $199. Installed on the student’s own smartphone, it overlays geolocation data with campus maps, notifying parents if the phone is not in the right classroom at the right time of day. Nothing wrong with that, if the parents and the students agree on it. But here’s the kicker from the Wall Street Journal:

“As online interactions have grown, schools have realized they have a trove of new data to look at, such as how much a student is accessing the syllabus, taking part in online discussions with classmates and reading assigned material. Such technology ‘shows faculty exactly where students are interacting outside as well [as] inside the classroom…’”

Then there’s the insurance company that promised a discount to drivers who allow it to digitally monitor driving habits. Progressive Insurance has distributed two million dongles that plug into the OBD (on-board diagnostics) port, which is the electronic communication center for the moving components of the car. The dongle monitors brakes, acceleration and other readings, including mileage and time of day, creating a record of the driver’s vehicle usage.

Seems that a skilled hacker can use the dongle to get into the vehicle’s core systems, according to a Forbes interview with security researcher Corey Thuen, who discovered that he could unlock doors and gather information about his truck’s engine by hooking up his laptop to the dongle. Says Thuen:

“It (the dongle) has no secure boot mechanism, no cellular communications authentication, and uses no secure communications protocols, possibly putting the lives of people inside the vehicle in danger.”
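What “no communications authentication” means in practice is easy to sketch. The Python example below is a generic illustration, not Progressive’s actual protocol: readings sent as bare bytes can be altered by anyone along the path, while even a simple keyed HMAC would let the receiver notice the tampering.

import hashlib
import hmac
import json

SHARED_KEY = b"per-device-secret"  # hypothetical key provisioned to one dongle

def sign(reading: dict) -> dict:
    payload = json.dumps(reading, sort_keys=True).encode()
    tag = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return {"payload": payload.decode(), "tag": tag}

def verify(message: dict) -> bool:
    expected = hmac.new(SHARED_KEY, message["payload"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["tag"])

msg = sign({"speed_mph": 34, "hard_brakes": 0, "hour": 23})
msg["payload"] = msg["payload"].replace('"hour": 23', '"hour": 14')  # tamper
print(verify(msg))  # False: the altered reading no longer matches its tag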

Safety implications aside, the driver information itself is vulnerable. It would be a piece of cake to intercept the dongle’s transmissions to Progressive, and to steal, erase, or alter the data, with potentially serious and possibly irrevocable consequences for the driver.  Happy New Year!


Feinstein versus the CIA: Incident reveals hypocrisy and incompetence

What do you call it when someone rifles through your computer files without your knowledge? Employers call it monitoring, and civil libertarians call it surveillance. U.S. Senator Dianne Feinstein calls it a “separation of powers” issue.

So Feinstein described her headline-grabbing dustup last year with the Central Intelligence Agency, in which the agency spied on staffers of the Senate Intelligence Committee while the committee was investigating certain interrogations by the CIA.

The squabble started after the CIA set up a dark network in an out-of-the-way location for Senate investigators to view specified CIA documents. When the agency realized it had not sufficiently blocked access to other files that should have remained hidden, five CIA staffers tried to assess the damage by snooping through Intelligence Committee computer files, and even reading email messages. Feinstein went public when she found out, blasting the agency for “violating separation of powers.”

While the senator rooted her fury in the CIA’s apparent intention to undermine her investigation, she discussed the incident in the language of inter-branch conflict, rather than call it what it was — spying by a spy agency. This suggests she stands by her vilification of Edward Snowden and her defense of the programs he leaked, while claiming that her own digital affairs should be off-limits to the prying eyes of alphabet soup agencies. In other words, Constitutional principles apply differently to powerful elected officials than to regular citizens.

Feinstein repeated the separation-of-powers charge when the story resurfaced this week, after a CIA investigative panel cleared the agency’s five snoops of wrongdoing. The agency’s internal report found its employees’ hacking activities had been “clearly inappropriate,” but were not cause for discipline.

But the report revealed something else: There were very basic failures of information governance that enabled the Senate committee to grab documents it shouldn’t have, and that subsequently justified clearing the CIA of bad faith or malfeasance.

The Washington Post reports: “… the accountability review board concluded that the CIA-Senate arrangement was so convoluted that the panel could find no clear rules on how the shared computer system was to be run, let alone whether any rules had been violated.”

Notwithstanding the novelty of the “arrangement,” please note that the world’s most powerful spy agency and the nation’s most powerful legislative body abdicated well-established standards that would have determined in advance the protocol for a digital data transaction. This planning is fundamental to digital security and privacy.

Contemplating the agency’s sloppy information governance, especially given what was at stake, should lead to serious doubts about extending to any federal agency more authority to collect, store, or probe the digital records of Americans. For anyone who’s still not clear that the federal government is a poor steward of information, please do a careful reading of the above linked story in the Post.

That brings us (briefly) to President Obama’s 2015 cybersecurity initiatives. The president’s package would facilitate continued government access to citizen communications, and would snag security researchers, journalists, lawyers, and others in a net cast much too wide for cybercriminals. A recent surge in global terrorist activities validates the need for strong cyberdefense, but does not justify tossing all the cybercrime and national security concerns into a cybersecurity blender and turning on the blades.

It’s only a proposal right now, but there are elements that cry out already for the kind of definition that was lacking from the Senate-CIA plan. Selecting a bullet point at random:

“All monitoring, collection, use, retention, and sharing of information are limited to protecting against cybersecurity threats…”

Well, one would hope. But what does the protection entail, and how do we define the threats? These are not nit-picky questions; they are essential to information governance that does the intended tasks and mitigates privacy invasion. Or whatever you prefer to call it.

The privacy apartheid: no money, no time, and no education add up to no privacy

When privacy was a natural state of affairs, protecting it required a set of window shades, and maybe a hedge between you and the neighbor.

Modern privacy is a commodity, and the price is staggering. It’s not just money. Privacy protection is really inconvenient now, and intellectually challenging – not always in a good way. It requires a combination of education, time, prosperity, and technical aptitude that’s rare in a single human being. If you’re deficient in two or more of those categories, welcome to the privacy apartheid. You’re the have-not.

Here’s a brief and partial survey of the cost of privacy.

On the practically free end, you can make sure your internet browsers run in SSL mode by default. The cost is a bit of time. It requires a Google search and the ability to download a plugin. You also need to understand when and why this will preclude internet access in certain circumstances, so you won’t freak out when it happens.

If you’re very serious and skilled, you can run your own email server. Hardcore privacy advocates recommend this as if it were a walk in the park. It requires equipment, dozens of hours to implement, and a great deal of technical aptitude to maintain, with numerous headaches guaranteed.
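To give a flavor of what “run your own email server” actually involves, here is a toy receive-only handler using the Python aiosmtpd library. Even this sketch leaves out everything that makes a real server viable: DNS and MX records, TLS certificates, SPF/DKIM/DMARC, spam filtering, storage, and deliverability.

from aiosmtpd.controller import Controller

class PrintHandler:
    # Toy handler: accept every message and print it. A production server
    # needs authentication, TLS, storage, and abuse controls, none of which
    # appears here.
    async def handle_DATA(self, server, session, envelope):
        print("From:", envelope.mail_from)
        print("To:", envelope.rcpt_tos)
        print(envelope.content.decode("utf8", errors="replace"))
        return "250 Message accepted for delivery"

if __name__ == "__main__":
    controller = Controller(PrintHandler(), hostname="127.0.0.1", port=8025)
    controller.start()
    input("Toy SMTP server listening on localhost:8025; press Enter to stop.\n")
    controller.stop()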

Recently, consumer-grade privacy-enhancing products have become available. Check out the Consumer Electronics Show, where for many years, gee-whiz products with the greatest privacy-invading potential have been the highlight. This year, a tiny space on the show floor features vendors of privacy, starting with signal-blocking cases for mobile devices ($69-$199, no technical proficiency required). Last Private Place featured a competing product for phones a few months back, the $80 Privacy Case.

Vysk is there, with a phone sleeve that has an encryption feature in the microphone to keep conversations private ($229). Using Vysk’s QS1 case requires a technical comfort level sufficient to activate the product’s privacy modes and download the subscription-based apps. Add a monthly charge of $9.99, plus at least an hour to experiment and understand settings and capabilities.

Virtual Private Network service provider PIA offers encrypted internet access on demand, with tech support (annual package price: $39.95). But it’s not simple. You need basic knowledge of how networks function and probably about three hours to understand and activate the service.

Privacy advocates also suggest staying off most social media. There’s always been a privacy bonus for avoiding gossipy neighbors, and there still is. But there’s also a professional penalty, because those gossips now call themselves LinkedIn and Facebook, and your competitors are present in droves. Opportunity cost: incalculable.

If you’re looking for cheap solutions, a hat and sunglasses may provide a defense against surveillance cameras. For a few bucks and a few minutes you can sew a couple of infrared LED lights into the hat, with a 9 volt battery to power them. Hacker lore says in certain lighting conditions, this will obscure your face from the cameras.
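For the tinkerers, the only math involved is sizing a current-limiting resistor. A back-of-the-envelope sketch with assumed component values (a typical infrared LED drops roughly 1.5 volts at 20 milliamps; check the datasheet for the parts you actually buy):

# Resistor sizing for the IR-LED hat, using assumed component values.
battery_v = 9.0            # 9-volt battery
led_forward_v = 1.5        # assumed IR LED forward voltage
leds_in_series = 2
target_current_a = 0.020   # 20 mA

resistor_ohms = (battery_v - leds_in_series * led_forward_v) / target_current_a
resistor_power_w = (battery_v - leds_in_series * led_forward_v) * target_current_a

print(f"Use roughly a {resistor_ohms:.0f}-ohm resistor (round up to the nearest "
      f"standard value), dissipating about {resistor_power_w * 1000:.0f} mW.")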

So you pay with money or you pay with time and know-how, or you pay with isolation. And no matter how you pay, no product exists to completely eradicate the ubiquitous privacy challenges that show up daily disguised as fun and convenience.

More embarrassing than a bad movie: Sony-shaming and Korea blaming

Troubling as it was to see the President of the United States lob questionable charges at a rogue regime and threaten proportional response – whatever that means – then jet off to Hawaii for a two-week holiday, it’s more troubling that in the face of evidence to the contrary, the FBI stands by its speedy assessment that the North Koreans are to blame for the Sony attack.

Attribution is the most difficult task in analyzing cybercrime. Truly adept cybercriminals write purposeful miscues in their code to implicate others, and anyone can buy code that’s been used by someone else. There’s a virtual Wal-Mart for cybercriminals out there on the dark web. Right next door, there’s another storehouse of used malware code for free. And another, and another.

This week, a security research firm that’s studied the Sony breach since its announcement tossed cold water on the FBI’s North Korea theory. Some of the most respected names in the information security realm agree that the feds are barking up the wrong tree. Many of these experts doubted the state-actor theory even as President Obama made his pre-Christmas announcement blaming North Korea.

Three notable concerns arise from the President’s premature saber-rattling, and from federal insistence on pursuing the course based on deficient intelligence.

The first is obvious. Threatening North Korea for hostilities that can’t be reliably attributed to it could become another embarrassing moment in United States international relations. Moreover, if Kim Jong Un’s regime has the skill and resources to pull off the Sony attack, such saber-rattling could be dangerous. If the leader of the free world believes an adversary to be capable, unpredictable, and on the warpath, why poke it in the eye and then head for the beach?

Second, why the rush to judgment? It’s been documented by Stuxnet chronicler Kim Zetter that the initial messages to Sony from its attackers were demands for money, and had nothing to do with Sony’s movie featuring the assassination of Kim Jong Un. Not until December 8, after the media had repeatedly linked the attack to some North Korean grumbling about the film dating back to last summer, did the attackers make their first public reference to it. In other words, the North Korea narrative may be a media invention, and may have clouded the motive of the real culprits, who appear to have wanted ransom, not censorship.

Note also that it wasn’t until someone threatened to blow up movie theaters, likely some capitalism-hating jackass of the variety that boasts of penetrating the networks of big companies for laughs, that the White House ventured an opinion. A cynic might suggest that a threat to movie theaters on Christmas Day is an opportunity to promote federal police power unlike any since the days after September 11.

Americans are gaining a rapid understanding of cybercrime, but it’s still murky to many. Some of these Americans, consumed by their own daily obligations, followed the story just closely enough to cite the insulting email remarks about Angelina Jolie, but did not grasp the dire effects of the attack on Sony. Some of them offered the theory on a recent radio talk show that Sony hacked itself, to generate publicity for its movie. What a fertile field in which to sow fear.

Finally, and in some ways most disturbing, the President conflated a straightforward business decision involving risk management with a sin against the First Amendment. Obama had plenty of company in this regard, with Hollywood heavies like George Clooney leading the charge, and commentators shepherding the general public to join in such a Sony-shaming harangue that the troubled company felt compelled to reverse itself after pulling the film from its scheduled debut.

Of course, foreign dictators have no standing to commit a First Amendment violation against U.S. citizens, but this became the mantra. Sony should not buckle to offshore censorship. (Like Facebook does?) Nobody defended Sony’s right to withhold speech, which is surely as embedded in the concept of free expression as is the right to disseminate ideas.

Sony, already smarting, and perhaps struggling for its very existence, was castigated for performing a fundamental risk assessment that measured the slight possibility of exploding movie theaters against the devastating consequences of such an occurrence, however unlikely. It was grossly unfair to paint Sony’s risk aversion as a slap in the face of the U.S. Constitution rather than a simple failure to devise a creative solution. Which it later did, under significant duress.

“I wish (the Sony executives) had called me,” the President said. No wonder they didn’t.

Five privacy measures the Nevada legislature should pass in 2015, Part I

1- Require public notice of license plate scanning, and set uniform standards for managing the data

Law enforcement agencies throughout Nevada are using license plate scanning technology. The scanners record every license plate number from every car they encounter, even at speeds up to 110 miles per hour, according to a proposal submitted by a vendor of car-mounted scanners.

Both local jurisdictions and the state have the devices. No uniform policies exist for storage and handling of the data, or for disclosure of records related to quantity of data captured, and whether it’s transmitted outside the state. These policies should be standardized across the state.

Citizens also have a right to know their vehicles are being tracked. The scans include GPS location data and other metadata that can be used to reconstruct activities and associations. Alerts to such technology should be posted in every jurisdiction where it’s used, on signs similar to those informing drivers that their speed is monitored by aircraft,  and should include a reference for more information.
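The reconstruction risk is concrete. A few lines of code against a scan log (the fields below are hypothetical) turn isolated sightings into a daily itinerary for any plate of interest:

from collections import defaultdict

# Hypothetical scan log entries: (plate, ISO timestamp, latitude, longitude).
scans = [
    ("ABC123", "2015-03-02T07:58", 36.169, -115.139),
    ("XYZ789", "2015-03-02T08:03", 36.114, -115.172),
    ("ABC123", "2015-03-02T12:16", 36.102, -115.174),
    ("ABC123", "2015-03-02T18:41", 36.205, -115.118),
]

# Group by plate and sort by time: each plate becomes a movement profile.
profiles = defaultdict(list)
for plate, timestamp, lat, lon in scans:
    profiles[plate].append((timestamp, lat, lon))

for plate, sightings in profiles.items():
    print(plate)
    for timestamp, lat, lon in sorted(sightings):
        print("  ", timestamp, lat, lon)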

2- Prohibit medical practices from scanning and storing driver’s licenses

Medical information is statistically among the most vulnerable to cybercrime and system breaches. The good news is, some insurance companies have apparently realized that scanning and storing driver’s licenses is a bad policy. The requirement is being loosened.

The bad news is that medical receptionists haven’t stopped demanding a copy the moment you enter the facility. You’re still forced to assert yourself if you want to keep your license out of their systems, perhaps even escalating the argument to the office manager. (Earlier this year, after losing one of these fights, I submitted questions in writing about the technical storage specifications and the rationale for retaining my license. Six weeks later I received a letter from the facility with the answer – “it’s not required and we’ll destroy the file if you’d like us to.”)

Why retain a document bearing a patient’s facial image and signature sample if you don’t have to? Why collect one more shred of personal information than necessary? This is an invitation to identity theft, and should be outlawed by statute.

3- Pass a resolution supporting federal reform of the Electronic Communications Privacy Act

NSA whistleblower Edward Snowden and Facebook founder Mark Zuckerberg were both toddlers when ECPA was written. That alone should convey the obsolescence of the federal law controlling government access to citizen communications, without further explanation of how the nature and volume of digital communication have changed, or of the expansive government tactics employed to pursue it.

It’s conceivable the lame duck congress could pass ECPA reform before the 2015 legislative session convenes in Nevada. There are proposals with bipartisan support in Washington, and there’s a lot of pressure to act from the tech sector and privacy advocates. But in the event ECPA reform sits until next year, the Nevada legislature could go on the record with a resolution supporting reform. The American Legislative Exchange Council has a model resolution that can serve as a starting point, even if it’s not adopted wholesale.

4- Create a legislative subcommittee on privacy

In 2014, privacy discussions revolve in large part around technology. But privacy is an element in virtually every aspect of personal, family, and business life. The state legislature inserts itself into all of these areas, mostly without contemplation of how its actions affect privacy.

A subcommittee should be formed to review the implications of the dozens of bills each session that affect privacy.

The committee should also sponsor a bill requiring all government agencies at every level to review their own privacy policies and report back to the committee on data collection and privacy protections, with the relevant results compiled and posted online for accessibility to the public. Nevadans deserve to know what data is collected, where and how it is stored, and the purpose for which it’s used.

5- Codify a set of data handling and retention policies to protect student privacy

Public school students are subject to unprecedented data collection regarding their health, academic performance, disciplinary issues, and even their home life. The old joke about things that end up on your “permanent record” isn’t a joke anymore; it’s a reality.

Data collection on K-12 students, to the extent it can be controlled at the state level, should be reviewed and analyzed to determine which information is truly useful and which might be considered extraneous.

Stringent data security and privacy policies should protect students from criminals and commercial activity, but also protect the adults they will become from being haunted by the kids they once were. That means a vigorous schedule of data destruction should be part of the policy. The data should not outlive its educational necessity, and the longer it’s retained, the greater the odds it will leak. Young people change as they mature. Damaging details about their past should not become one more obstacle for them to overcome when they’re trying to conduct their adult lives.

California has just passed two laws that bear discussion and analysis. More about these statutes in a future post. They’re conceived to fill gaps in federal student privacy protections. They’re not completely in sync with the sentiments above, but they’re a demonstration of control at the state level.

Nevada has no business mandating a remote kill switch for cellphones

The Silver State is still basking in praise uttered last month by Tesla founder Elon Musk, who dubbed Nevada a “get things done” state. What a shame if we followed the business-friendly feats that persuaded Tesla to locate its battery plant here with legislation signaling the opposite attitude – like requiring cellphones sold in Nevada to have remote kill capability.

The 2015 Nevada legislature will contemplate the kill switch mandate as a way to curb smartphone theft, according to a USA Today report.

Why would a state striving to reinvent itself as a technology haven insert itself into design standards for a globally-distributed tech product? Surely not because Nevada wants to emulate California, where a new kill switch law goes into effect next July.

Perhaps the advocates anticipate the reliable political boost that comes from supporting a crime bill, even when the resulting law is largely symbolic. In this case, symbolic because Apple, Google and Microsoft, which together command nearly 100 percent of the smartphone market, are already equipping their devices with remote anti-theft features, or preparing to do so in their next versions. For the tiny remainder of the market, mostly Blackberry users, the remote-kill feature has existed for years.

Beyond the politics, when governments manipulate the architecture of communication equipment, privacy and civil rights implications must be considered. State-mandated remote access provides an opening for abuse, as demonstrated by federal backdoor requirements that paved the way for spying and cybercrime.

The new California law says service to a phone can be halted only by “an authorized user.” But guess who is “authorized” besides the owner of the phone? Government at any level, right down to the dog catcher. The California statute incorporates a Public Utilities Commission Code section allowing a “government entity” – broadly defined – to get a court order requiring the provider to cut service for a “reasonably necessary” period.

"Governmental entity" means every local government, including a city, 
county, a transit, joint powers, special, or other district, the state, 
and every agency, department, commission, board, bureau, or other 
political subdivision of the state, or any authorized agent thereof.

Based on this language, a bus supervisor or a fire inspector could conceivably be an authorized agent.

The circumstances allowing a government-ordered shutdown include probable cause that the phone is being used for an unlawful purpose. Or that without intervention, there is jeopardy to public health, safety or welfare.

Civil libertarians are going on the record with warnings that peaceful protesters might be shut down, or that law enforcement might misuse the remote kill. And it’s hardly ridiculous to wonder what public welfare threats involving individual cellphones government might perceive.

Sections authorizing government use will be the consequential portion of any Nevada kill switch law, given the industry’s already clearly-stated intention to provide anti-theft technology. Observers of Nevada’s 2015 session should watch closely. Or, maybe Nevada’s stated goal of luring tech will prevail, and the legislature will spend its time on other matters.

Not without a fight: Keeping the cops out of your iPhone

Law enforcement has its boxer shorts in a bunch following Apple’s announcement that iPhone 6 and iOS 8 devices will be protected from the probing eyes of police. Google, too, will complicate criminal investigations by changing its optional Android encryption feature to automatic.

Federal officials are “bracing for a confrontation with Silicon Valley” over encryption-by-default, reports the Wall Street Journal. A former FBI official called the new privacy feature “outrageous,” and likened it to an invitation for criminals to use the products.

Lame-duck Attorney General Eric Holder piled on this week, denouncing encryption and other privacy tools to the Global Alliance Conference Against Child Sexual Abuse. Holder wants tech companies to provide back doors into their products. To save the children, of course, whose exploitation at the hands of perverts is second only to their exploitation by the political class.

Criminals are certain to use encrypted smart phones, but so are millions of law-abiding citizens who’d like to send private messages to their physicians, their business prospects, and their mistresses. Some of them might snap a nude selfie or two. Everyone has perfectly legal secrets to keep.
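The principle behind encryption-by-default fits in a few lines. The sketch below uses the Python cryptography library’s Fernet recipe as a stand-in, not Apple’s or Google’s actual scheme: without the key, which on a phone is derived from the owner’s passcode, the stored bytes are useless to whoever seizes the device.

from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()          # on a phone, derived from the passcode
vault = Fernet(key)

ciphertext = vault.encrypt(b"perfectly legal secret")
print(ciphertext)                    # unreadable without the key

stranger = Fernet(Fernet.generate_key())
try:
    stranger.decrypt(ciphertext)
except InvalidToken:
    print("No key, no plaintext.")

print(vault.decrypt(ciphertext))     # the owner, holding the key, reads it back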

Bravo to Apple and Google for their late, if grudging, arrival to the privacy party. In this arena, it should be noted that Apple is following, not leading. The privacy community celebrated the release earlier this year of the encrypted Blackphone, an Android adaptation with amped-up security and privacy. Blackphone hails from Switzerland. The company was located there for precisely the reasons outlined above, by an American technology entrepreneur who champions privacy.

Law enforcement’s fight for continued easy access to devices, cloud storage, and business-grade communication tools compromises the American economy, as well as personal privacy. Global companies have reservations about buying products that are open to U.S. government fishing expeditions. Just one more reason for American innovators to move offshore.

And this squabble ignores the valuable role of encryption in fighting cybercrime. Both personal and business data need to be properly encrypted. The fight to secure your snapshots is one thing. But guarding embedded systems that control water and power facilities is the next level.  As is the safety of intellectual property that’s the lifeblood of innovation. How about keeping criminals and terrorists out of the banking networks and the airport control towers?

Holder and company are apparently willing to expose a data pool containing far more than the rate at which your next-door neighbor consumes dirty movies. (Evidence of which, incredibly enough, is sometimes seized by prosecutors grasping for dubious rape convictions, in order to establish predatory intent.)

Police have many avenues to truly relevant evidence. Third-party billing records and court orders requiring legitimate suspects to turn over their encryption keys offer a slower, but better-considered route. More generally, the surveillance deck is stacked heavily in law enforcement’s favor.

It’s time to nix the notion that our privacy is disposable because there’s a guy waiting behind every bush to molest children and grab college girls, or that everyone who carries more than one cell phone must be a drug dealer. Such an allusion to multiple phones was made by none other than Chief Justice John Roberts, even as he was about to author this summer’s Supreme Court decision requiring warrants for cell phone searches.

It’s also time to hit back at the rhetorical last refuge of scoundrels. “What if it was your daughter?” – a wretched query hurled regularly at privacy advocates, as if only those unscathed by crime should appreciate principles that protect all citizens from intrusions into their personal affairs.

These tired crime-fighting tropes need a rest, and so does the assumption that every technological tool should be an automatic enhancement to police power. These are stops along a path that leads to fear and mistrust of the police. That’s not a desirable outcome for any of us, as we saw in Ferguson, Missouri.