
FW: CRYPTO-GRAM, November 15, 2006, from Bruce Schneier



" The solution: Paper ballots, which can be verified 
by voters and recounted if necessary." 
--
Pete Klammer, P.E. / ACM(1970), IEEE, ICCP(CCP), NSPE(PE), NACSE(NSNE)
3200 Routt Street / Wheat Ridge, Colorado 80033-5452
(303)233-9485 / Fax:(303)274-6182 / Mailto:PKlammer@xxxxxxx
 "Idealism doesn't win every contest; but that's not what I choose it for."


-----Original Message-----
From: Bruce Schneier [mailto:schneier@xxxxxxxxxxxxxxx] 
Sent: Wednesday, November 15, 2006 1:49 AM
To: CRYPTO-GRAM-LIST@xxxxxxxxxxxxxxxxxxxx
Subject: CRYPTO-GRAM, November 15, 2006

                  CRYPTO-GRAM

               November 15, 2006

               by Bruce Schneier
                Founder and CTO
       Counterpane Internet Security, Inc.
            schneier@xxxxxxxxxxxxxxx
             http://www.schneier.com
            http://www.counterpane.com


A free monthly newsletter providing summaries, analyses, insights, and 
commentaries on security: computer and otherwise.

For back issues, or to subscribe, visit 
<http://www.schneier.com/crypto-gram.html>.

You can read this issue on the web at 
<http://www.schneier.com/crypto-gram-0611.html>.  These same essays 
appear in the "Schneier on Security" blog: 
<http://www.schneier.com/blog>.  An RSS feed is available.


** *** ***** ******* *********** *************

In this issue:
      Voting Technology and Security
      More on Electronic Voting Machines
      The Inherent Inaccuracy of Voting
      The Need for Professional Election Officials
      Perceived Risk vs. Actual Risk
      Crypto-Gram Reprints
      Total Information Awareness Is Back
      Forge Your Own Boarding Pass
      News
      The Death of Ephemeral Conversation
      Airline Passenger Profiling for Profit
      Counterpane News
      Architecture and Security
      The Doghouse: Skylark Utilities
      Heathrow Tests Biometric ID
      Please Stop My Car
      Air Cargo Security
      Cheyenne Mountain Retired
      Comments from Readers

** *** ***** ******* *********** *************

      Voting Technology and Security



Last week in Florida's 13th Congressional district, the victory margin 
was only 386 votes out of 153,000. There'll be a mandatory lawyered-up 
recount, but it won't include the almost 18,000 votes that seem to have 
disappeared. The electronic voting machines didn't include them in their 
final tallies, and there's no backup to use for the recount. The 
district will pick a winner to send to Washington, but it won't be 
because they are sure the majority voted for him. Maybe the majority 
did, and maybe it didn't. There's no way to know.

Electronic voting machines represent a grave threat to fair and accurate 
elections, a threat that every American -- Republican, Democrat or 
independent -- should be concerned about. Because they're 
computer-based, the deliberate or accidental actions of a few can swing 
an entire election. The solution: Paper ballots, which can be verified 
by voters and recounted if necessary.

To understand the security of electronic voting machines, you first have 
to consider election security in general. The goal of any voting system 
is to capture the intent of each voter and collect them all into a final 
tally. In practice, this occurs through a series of transfer steps. When 
I voted last week, I transferred my intent onto a paper ballot, which 
was then transferred to a tabulation machine via an optical scan reader; 
at the end of the night, the individual machine tallies were transferred 
by election officials to a central facility and combined into a single 
result I saw on television.

All election problems are errors introduced at one of these steps, 
whether it's voter disenfranchisement, confusing ballots, broken 
machines or ballot stuffing. Even in normal operations, each step can 
introduce errors. Voting accuracy, therefore, is a matter of 1) 
minimizing the number of steps, and 2) increasing the reliability of 
each step.
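
A rough way to see why both factors matter: if the transfer steps fail 
independently, their error rates compound. The numbers in this little 
sketch are invented purely for illustration; they are not measurements 
from any real election.

def end_to_end_accuracy(step_error_rates):
    """Probability a single voter's intent survives every transfer step,
    assuming the steps fail independently (hypothetical error rates)."""
    accuracy = 1.0
    for e in step_error_rates:
        accuracy *= (1.0 - e)
    return accuracy

# Four steps (mark ballot, optical scan, machine tally, central aggregation),
# each 99.9% reliable:
print(end_to_end_accuracy([0.001] * 4))   # ~0.996: roughly 4 votes per 1,000 affected

# Fewer, more reliable steps do better:
print(end_to_end_accuracy([0.0005] * 3))  # ~0.9985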

Much of our election security is based on "security by competing 
interests." Every step, with the exception of voters completing their 
single anonymous ballots, is witnessed by someone from each major party; 
this ensures that any partisan shenanigans -- or even honest mistakes -- 
will be caught by the other observers. This system isn't perfect, but 
it's worked pretty well for a couple hundred years.

Electronic voting is like an iceberg; the real threats are below the 
waterline where you can't see them. Paperless electronic voting machines 
bypass that security process, allowing a small group of people -- or 
even a single hacker -- to affect an election. The problem is software 
-- programs that are hidden from view and cannot be verified by a team 
of Republican and Democrat election judges, programs that can 
drastically change the final tallies. And because all that's left at the 
end of the day are those electronic tallies, there's no way to verify 
the results or to perform a recount. Recounts are important.

This isn't theoretical. In the U.S., there have been hundreds of 
documented cases of electronic voting machines distorting the vote to 
the detriment of candidates from both political parties: machines losing 
votes, machines swapping the votes for candidates, machines registering 
more votes for a candidate than there were voters, machines not 
registering votes at all. I would like to believe these are all mistakes 
and not deliberate fraud, but the truth is that we can't tell the 
difference. And these are just the problems we've caught; it's almost 
certain that many more problems have escaped detection because no one 
was paying attention.

This is both new and terrifying. For the most part, and throughout most 
of history, election fraud on a massive scale has been hard; it requires 
very public actions or a highly corrupt government -- or both. But 
electronic voting is different: a lone hacker can affect an election. He 
can do his work secretly before the machines are shipped to the polling 
stations. He can affect an entire area's voting machines. And he can 
cover his tracks completely, writing code that deletes itself after the 
election.

And that assumes well-designed voting machines. The actual machines 
being sold by companies like Diebold, Sequoia Voting Systems and 
Election Systems & Software are much worse. The software is badly 
designed. Machines are "protected" by hotel minibar keys. Vote tallies 
are stored in easily changeable files. Machines can be infected with 
viruses. Some voting software runs on Microsoft Windows, with all the 
bugs and crashes and security vulnerabilities that introduces. The list 
of inadequate security practices goes on and on.

The voting machine companies counter that such attacks are impossible 
because the machines are never left unattended (they're not), the memory 
cards that hold the votes are carefully controlled (they're not), and 
everything is supervised (it isn't). Yes, they're lying, but they're 
also missing the point.

We shouldn't -- and don't -- have to accept voting machines that might 
someday be secure only if a long list of operational procedures are 
followed precisely. We need voting machines that are secure regardless 
of how they're programmed, handled and used, and that can be trusted 
even if they're sold by a partisan company, or a company with possible 
ties to Venezuela.

Sounds like an impossible task, but in reality, the solution is 
surprisingly easy. The trick is to use electronic voting machines as 
ballot-generating machines. Vote by whatever automatic touch-screen 
system you want: a machine that keeps no records or tallies of how 
people voted, but only generates a paper ballot. The voter can check it 
for accuracy, then process it with an optical-scan machine. The second 
machine provides the quick initial tally, while the paper ballot 
provides for recounts when necessary. And absentee and backup ballots 
can be counted the same way.

You can even do away with the electronic vote-generation machines 
entirely and hand-mark your ballots like we do in Minnesota. Or run a 
100% mail-in election like Oregon does. Again, paper ballots are the key.

Paper? Yes, paper. A stack of paper is harder to tamper with than a 
number in a computer's memory. Voters can see their vote on paper, 
regardless of what goes on inside the computer. And most important, 
everyone understands paper. We get into hassles over our cell phone 
bills and credit card mischarges, but when was the last time you had a 
problem with a $20 bill? We know how to count paper. Banks count it all 
the time. Both Canada and the U.K. count paper ballots with no problems, 
as do the Swiss. We can do it, too. In today's world of computer 
crashes, worms and hackers, a low-tech solution is the most secure.

Secure voting machines are just one component of a fair and honest 
election, but they're an increasingly important part. They're where a 
dedicated attacker can most effectively commit election fraud (and we 
know that changing the results can be worth millions). But we shouldn't 
forget other voter suppression tactics: telling people the wrong polling 
place or election date, taking registered voters off the voting rolls, 
having too few machines at polling places, or making it onerous for 
people to register. (Oddly enough, ineligible people voting isn't a 
problem in the U.S., despite political rhetoric to the contrary; every 
study shows their numbers to be so small as to be insignificant. And 
photo ID requirements actually cause more problems than they solve.)

Voting is as much a perception issue as it is a technological issue. 
It's not enough for the result to be mathematically accurate; every 
citizen must also be confident that it is correct. Around the world, 
people protest or riot after an election not when their candidate loses, 
but when they think their candidate lost unfairly. It is vital for a 
democracy that an election both accurately determine the winner and 
adequately convince the loser. In the U.S., we're losing the perception 
battle.

The current crop of electronic voting machines fail on both counts. The 
results from Florida's 13th Congressional district are neither accurate 
nor convincing. As a democracy, we deserve better. We need to refuse to 
vote on electronic voting machines without a voter-verifiable paper 
ballot, and to continue to pressure our legislatures to implement voting 
technology that works.

This essay originally appeared on Forbes.com.
http://www.forbes.com/home/security/2006/11/10/voting-fraud-security-tech-security-cz_bs_1113security.html

http://www.schneier.com/essay-068.html
http://www.schneier.com/blog/archives/2004/11/the_problem_wit.html
http://www.votingintegrity.org/archive/news/e-voting.html
http://www.verifiedvoting.org/article.php?id=997
http://www.ecotalk.org/VotingMachineErrors.htm
http://evote-mass.org/pipermail/evote-discussion_evote-mass.org/2005-January/000080.html
or http://tinyurl.com/yhvb2a
http://avirubin.com/vote/analysis/index.html
http://www.freedom-to-tinker.com/?p=1080
http://www.freedom-to-tinker.com/?p=1081
http://www.freedom-to-tinker.com/?p=1064
http://www.freedom-to-tinker.com/?p=1084
http://www.bbvforums.org/cgi-bin/forums/board-auth.cgi?file=/1954/15595.html
or http://tinyurl.com/9ywcn
http://itpolicy.princeton.edu/voting
http://www.ss.ca.gov/elections/voting_systems/security_analysis_of_the_diebold_accubasic_interpreter.pdf
or http://tinyurl.com/eqpbd
http://www.blackboxvoting.org
http://www.brennancenter.org/dynamic/subpages/download_file_38150.pdf
http://avirubin.com/judge2.html
http://avirubin.com/judge.html
http://www.usatoday.com/news/washington/2006-10-29-voting-systems-probe_x.htm
or http://tinyurl.com/ylnba6

How to Steal an Election:
http://arstechnica.com/articles/culture/evoting.ars

Florida 13:
http://www.heraldtribune.com/apps/pbcs.dll/article?AID=/20061111/NEWS/611110643
or http://tinyurl.com/ygo73l
http://www.heraldtribune.com/apps/pbcs.dll/article?Date=20061108&Category=NEWS&ArtNo=611080506
or http://tinyurl.com/yahvve
http://www.heraldtribune.com/apps/pbcs.dll/article?AID=/20061109/NEWS/611090343
or http://tinyurl.com/yhkwdt
http://www.nytimes.com/2006/11/10/us/politics/10florida.html
http://www.lipsio.com/SarasotaFloridaPrecinct22IncidentPhotos/

Value of stolen elections:
http://www.schneier.com/essay-046.html

Perception:
http://www.npr.org/templates/story/story.php?storyId=6449790

Voter suppression:
http://blackprof.com/stealingd.html

ID requirements:
http://www.lwvwi.org/cms/images/stories/PDFs/VR%20Photo%20ID.pdf
http://www.demos.org/page337.cfm

Foxtrot cartoon:
http://www.gocomics.com/foxtrot/2006/10/29

Avi Rubin wrote a good essay on voting for "Forbes" as well.
http://www.forbes.com/home/free_forbes/2006/0904/040.html


** *** ***** ******* *********** *************

      More on Electronic Voting Machines



Florida 13 is turning out to be a bigger problem than I described:

"The Democrat, Christine Jennings, lost to her Republican opponent, Vern 
Buchanan, by just 373 votes out of a total 237,861 cast -- one of the 
closest House races in the nation. More than 18,000 voters in Sarasota 
County, or 13 percent of those who went to the polls Tuesday, did not 
seem to vote in the Congressional race when they cast ballots, a 
discrepancy that Kathy Dent, the county elections supervisor, said she 
could not explain.

"In comparison, only 2 percent of voters in one neighboring county 
within the same House district and 5 percent in another skipped the 
Congressional race, according to The Herald-Tribune of Sarasota. And 
many of those who did not seem to cast a vote in the House race did vote 
in more obscure races, like for the hospital board."

And the absentee ballots collected for the same race show only a 2.5% 
difference in the number of voters that voted for candidates in other 
races but not for Congress.

There'll be a recount, and with that close a margin it's pretty random 
who will eventually win.  But because so many votes were not recorded -- 
and I don't see how anyone who has any understanding of statistics can 
look at this data and not conclude that votes were not recorded -- we'll 
never know who should really win this district.

In Pennsylvania, the Republican State Committee is asking the Secretary 
of State to impound voting machines because of potential voting errors. 
  According to KDKA:

"Pennsylvania GOP officials claimed there were reports that some 
machines were changing Republican votes to Democratic votes. They asked 
the state to investigate and said they were not ruling out a legal 
challenge.

"According to Santorum's camp, people are voting for Santorum, but the 
vote either registered as invalid or a vote for Casey."

RedState.com describes some of the problems:

"RedState is getting widespread reports of an electoral nightmare 
shaping up in Pennsylvania with certain types of electronic voting machines.

"In some counties, machines are crashing. In other counties, we have 
enough reports to treat as credible the fact that some Rendell votes 
are being tabulated by the machines for Swann and vice versa. The same 
is happening with Santorum and Casey. Reports have been filed with the 
Pennsylvania Secretary of State, but nothing has happened."

I'm happy to see a Republican at the receiving end of the problems.

Actually, that's not true.  I'm not happy to see anyone at the receiving 
end of voting problems.  But I am sick and tired of this being perceived 
as a partisan issue, and I hope some high-profile Republican losses that 
might be attributed to electronic voting-machine malfunctions (or even 
fraud) will change that perception.  This is a serious problem that 
affects everyone, and it is in everyone's interest to fix it.

FL-13 was the big voting-machine disaster, but there were other 
electronic voting-machine problems reported.  EFF wrote:  "The types of 
machine problems reported to EFF volunteers were wide-ranging in both 
size and scope. Polls opened late for machine-related reasons in polling 
places throughout the country, including Ohio, Florida, Georgia, 
Virginia, Utah, Indiana, Illinois, Tennessee, and California. In Broward 
County, Florida, voting machines failed to start up at one polling 
place, leaving some citizens unable to cast votes for hours. EFF and the 
Election Protection Coalition sought to keep the polling place open late 
to accommodate voters frustrated by the delays, but the officials 
refused. In Utah County, Utah, more than 100 precincts opened one to two 
hours late on Tuesday due to problems with machines. Both county and 
state election officials refused to keep polling stations open longer to 
make up for the lost time, and a judge also turned down a voter's plea 
for extended hours brought by EFF."

And there's an election for mayor, where one of the candidates received 
zero votes -- even though that candidate is sure he voted for himself.

ComputerWorld is also reporting problems across the country, as is "The 
New York Times".  Avi Rubin, whose writings on electronic voting 
security are always worth reading, writes about a problem he witnessed 
in Maryland:

"The voter had made his selections and pressed the "cast ballot" button 
on the machine. The machine spit out his smartcard, as it is supposed to 
do, but his summary screen remained, and it did not appear that his vote 
had been cast. So, he pushed the smartcard back in, and it came out 
saying that he had already voted. But, he was still in the screen that 
showed he was in the process of voting. The voter then pressed the "cast 
ballot" again, and an error message appeared on the screen that said 
that he needs to call a judge for assistance. The voter was very 
patient, but was clearly taking this very seriously, as one would 
expect. After discussing the details about what happened with him very 
carefully, I believed that there was a glitch with his machine, and that 
it was in an unexpected state after it spit out the smartcard. The 
question we had to figure out was whether or not his vote had been 
recorded. The machine said that there had been 145 votes cast. So, I 
suggested that we count the voter authority cards in the envelope 
attached to the machine. Since we were grouping them into bundles of 25 
throughout the day, that was pretty easy, and we found that there were 
146 authority cards. So, this meant that either his vote had not been 
counted, or that the count was off for some other reason. Considering 
that the count on that machine had been perfect all day, I thought that 
the most likely thing is that this glitch had caused his vote not to 
count. Unfortunately, because while this was going on, all the other 
voters had left, other election judges had taken down and put away the 
e-poll books, and we had no way to encode a smartcard for him. We were 
left with the possibility of having the voter vote on a provisional 
ballot, which is what he did. He was gracious, and understood our 
predicament.

"The thing is, that I don't know for sure now if this voter's vote will 
be counted once or twice (or not at all if the board of election rejects 
his provisional ballot). In fact, the purpose of counting the voter 
authority cards is to check the counts on the machines hourly. What we 
had done was to use the number of cards to conclude something about 
whether a particular voter had voted, and that is not information that 
these cards can provide. Unfortunately, I believe there are an 
unimaginable number of problems that could crop up with these machines 
where we would not know for sure if a voter's vote had been recorded, 
and the machines provide no way to check on such questions. If we had 
paper ballots that were counted by optical scanners, this kind of 
situation could never occur."

How many hundreds of these stories do we need before we conclude that 
electronic voting machines aren't accurate enough for elections?

On the plus side, the FL-13 problems have convinced some previous 
naysayers in that district:  "Supervisor of Elections Kathy Dent now 
says she will comply with voters who want a new voting system -- one 
that produces a paper trail....  Her announcement Friday marks a 
reversal for the elections supervisor, who had promoted and adamantly 
defended the touch-screen system the county purchased for $4.5 million 
in 2001."

One of the dumber comments I hear about electronic voting goes something 
like this: "If we can secure multi-million-dollar financial 
transactions, we should be able to secure voting."  Most financial 
security comes through audit: names are attached to every transaction, 
and transactions can be unwound if there are problems.  Voting requires 
an anonymous ballot, which means that most of our anti-fraud systems 
from the financial world don't apply to voting.  (I first explained this 
back in 2001.)

In Minnesota, we use paper ballots counted by optical scanners, and we 
have some of the most well-run elections in the country.  To anyone 
reading this who needs to buy new election equipment, this is what to buy.

On the other hand, I am increasingly of the opinion that an all mail-in 
election -- like Oregon has -- is the right answer.  Yes, there are 
authentication issues with mail-in ballots, but these are issues we have 
to solve anyway, as long as we allow absentee ballots.  And yes, there 
are vote-buying issues, but almost everyone considers them to be 
secondary.  The combined benefits of 1) a paper ballot, 2) no worries 
about long lines due to malfunctioning or insufficient machines, 3) 
increased voter turnout, and 4) a dampening of the last-minute campaign 
frenzy make Oregon's election process very appealing.

FL-13:
http://www.nytimes.com/2006/11/10/us/politics/10florida.html
http://www.srqelections.com/results/gen2006sum.htm
http://www.srqelections.com/results/gen2006pct.htm
http://www.heraldtribune.com/apps/pbcs.dll/article?AID=/20061111/NEWS/611110643
or http://tinyurl.com/ygo73l

Convincing naysayers:
http://www.heraldtribune.com/apps/pbcs.dll/article?AID=/20061111/NEWS/611110530
or http://tinyurl.com/yhr6uv

Pennsylvania:
http://kdka.com/topstories/local_story_311194635.html
http://www.redstate.com/stories/elections/2006/breaking_massive_meltdown_in_pennsylvanian
or http://tinyurl.com/yjrb68

http://www.eff.org/news/archives/2006_11.php#004991
http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9004849&source=NLT_SEC&nlid=38
or http://tinyurl.com/yf652b
http://www.nytimes.com/2006/11/08/us/politics/08blogs.html
http://arstechnica.com/news.ars/post/20061101-8131.html
http://www.bradblog.com/?p=3714
http://www.bradblog.com/?p=3719

E-voting state by state:
http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9004591
or http://tinyurl.com/yhg3bw

E-voting vendors:
http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9004583
or http://tinyurl.com/y6sxuf

HBO's "Hacking Democracy" documentary:
http://www.hbo.com/docs/programs/hackingdemocracy/index.html
http://www.nytimes.com/2006/11/02/arts/television/02hack.html
http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9004584
or http://tinyurl.com/yhj8ob

Avi Rubin on voting:
http://avirubin.com/vote/
http://avi-rubin.blogspot.com/2006/11/my-day-at-polls-maryland-general.html 
or http://tinyurl.com/yjze8k

David Wagner and Ed Felten design a better voting machine.
http://www.wired.com/news/politics/evote/1,71957-0.html

Mayoral election:
http://abcnews.go.com/US/wireStory?id=2646802&CMP=OTC-RSSFeeds0312

My previous writings on electronic voting, as far back as 2000:
http://www.schneier.com/essay-068.html
http://www.schneier.com/essay-067.html
http://www.schneier.com/crypto-gram-0312.html#9
http://www.schneier.com/essay-101.html
http://www.schneier.com/crypto-gram-0012.html#1

Voting vs. e-commerce:
http://www.schneier.com/crypto-gram-0102.html#10


** *** ***** ******* *********** *************

      The Inherent Inaccuracy of Voting



In a "New York Times" op-ed, New York University sociology professor 
Dalton Conley points out that vote counting is inherently inaccurate:

"The rub in these cases is that we could count and recount, we could 
examine every ballot four times over and we'd get -- you guessed it -- 
four different results. That's the nature of large numbers -- there is 
inherent measurement error. We'd like to think that there is a "true" 
answer out there, even if that answer is decided by a single vote. We so 
desire the certainty of thinking that there is an objective truth in 
elections and that a fair process will reveal it.

"But even in an absolutely clean recount, there is not always a sure 
answer. Ever count out a large jar of pennies? And then do it again? And 
then have a friend do it? Do you always converge on a single number? Or 
do you usually just average the various results you come to? If you are 
like me, you probably settle on an average. The underlying notion is 
that each election, like those recounts of the penny jar, is more like a 
poll of some underlying voting population."

He's right, but it's more complicated than that.

There are two basic types of voting errors: random errors and systemic 
errors. Random errors are just that, random.  Votes intended for A that 
mistakenly go to B are just as likely as votes intended for B that 
mistakenly go to A.  This is why, traditionally, recounts in close 
elections are unlikely to change things.  The recount will find the few 
percent of the errors in each direction, and they'll cancel each other 
out.  But in a very close election, a careful recount will yield a more 
accurate -- but almost certainly not perfectly accurate -- result.

Systemic errors are more important, because they will cause votes 
intended for A to go to B at a different rate than the reverse.  Those 
can make a dramatic difference in an election, because they can easily 
shift thousands of votes from A to B without any counterbalancing shift 
from B to A.  These errors can either be a particular problem in the 
system -- a badly designed ballot, for example -- or a random error that 
only occurs in precincts where A has more supporters than B.
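
A quick simulation makes the distinction concrete. Everything below -- the 
turnout, the margin, and the error rates -- is made up purely for 
illustration; the point is only that random errors mostly cancel while 
systemic errors push the count in one direction.

import random

def count(votes_for_a, votes_for_b, err_a_to_b, err_b_to_a):
    """Tally an election in which each ballot can be miscounted for the
    other candidate with the given (independent) probabilities."""
    a = sum(1 for _ in range(votes_for_a) if random.random() > err_a_to_b)
    a += sum(1 for _ in range(votes_for_b) if random.random() < err_b_to_a)
    return a, (votes_for_a + votes_for_b) - a

random.seed(1)
true_a, true_b = 76500, 76100            # a 400-vote true margin

# Random error: both directions equally likely -- miscounts largely cancel,
# and the 400-vote margin survives (plus or minus some noise).
print(count(true_a, true_b, 0.01, 0.01))

# Systemic error: A's votes flip to B ten times as often as the reverse --
# the same order of error rate erases, and here reverses, the margin.
print(count(true_a, true_b, 0.01, 0.001))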

Here's where the problems of electronic voting machines become critical: 
they're more likely to be systemic problems.  Vote flipping, for 
example, seems to generally affect one candidate more than another. 
Even individual machine failures are going to affect supporters of one 
candidate more than another, depending on where the particular machine 
is.  And if there are no paper ballots to fall back on, no recount can 
undo these problems.

Conley proposes to nullify any election where the margin of victory is 
less than 1%, and have everyone vote again.  I agree, but I think his 
margin is too large.  In the Virginia Senate race, Allen was right not 
to demand a recount.  Even though his 7,800-vote loss was only 0.33%, in 
the absence of systemic flaws it is unlikely that a recount would change 
things.  I think an automatic revote if the margin of victory is less 
than 0.1% makes more sense.

Conley again:

"Yes, it costs more to run an election twice, but keep in mind that many 
places already use runoffs when the leading candidate fails to cross a 
particular threshold. If we are willing to go through all that trouble, 
why not do the same for certainty in an election that teeters on a 
razor's edge? One counter-argument is that such a plan merely shifts the 
realm of debate and uncertainty to a new threshold -- the 99 percent 
threshold. However, candidates who lose by the margin of error have a 
lot less rhetorical power to argue for redress than those for whom an 
actual majority is only a few votes away.

"It may make us existentially uncomfortable to admit that random chance 
and sampling error play a role in our governance decisions. But in 
reality, by requiring a margin of victory greater than one, seemingly 
arbitrary vote, we would build in a buffer to democracy, one that offers 
us a more bedrock sense of security that the 'winner' really did win."

This is a good idea, but it doesn't address the systemic problems with 
voting. If there are systemic problems, there should be another election 
day limited to only those precincts that had the problem and only those 
people who can prove they voted -- or tried to vote and failed -- during 
the first election day.  (Although I could be persuaded that another 
re-voting protocol would make more sense.)

But most importantly, we need better voting machines and better voting 
procedures.

http://www.nytimes.com/2006/11/06/opinion/06conley.html

Vote flipping:
http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9004858&source=NLT_SEC&nlid=38
or http://tinyurl.com/yfdhk6


** *** ***** ******* *********** *************

      The Need for Professional Election Officials



In the U.S., elections are run by an army of hundreds of thousands of 
volunteers.  These are both Republicans and Democrats, and the idea is 
that the one group watches the other: security by competing interests. 
But at the top are state-elected or -appointed officials, and many 
election shenanigans in the past several years have been perpetrated by 
them.

In yet another "New York Times" op-ed, Loyola Law School professor 
Richard Hansen argues" for professional, non-partisan election 
officials:  "The United States should join the rest of the world's 
advanced democracies and put nonpartisan professionals in charge. We 
need officials whose ultimate allegiance is to the fairness, integrity 
and professionalism of the election process, not to helping one party or 
the other gain political advantage. We don't need disputes like the 
current one in Florida being resolved by party hacks."

And:  "To improve the chances that states will choose an independent and 
competent chief elections officer, states should enact laws making that 
officer a long-term gubernatorial appointee who takes office only upon 
confirmation by a 75 percent vote of the legislature -- a supermajority 
requirement that would ensure that a candidate has true bipartisan 
support. Nonpartisanship in election administration is no dream. It is 
how Canada and Australia run their national elections."

To me, this is easier said than done.  Where are these hundreds of 
thousands of disinterested election officials going to come from?  And 
how do we ensure that they're disinterested and fair, and not just 
partisans in disguise?  I actually like security by competing interests.

But I do like his idea of a supermajority-confirmed chief elections 
officer for each state.  And at least he's starting the debate about 
better election procedures in the U.S.


http://www.nytimes.com/2006/11/11/opinion/11hasen.html


** *** ***** ******* *********** *************

      Perceived Risk vs. Actual Risk



I've written repeatedly about the difference between perceived and 
actual risk, and how it explains many seemingly perverse security 
trade-offs.  Here's a "Los Angeles Times" op-ed that does the same.  The 
author is Daniel Gilbert, psychology professor at Harvard.  (I just 
recently finished his book "Stumbling on Happiness," which is not a 
self-help book but instead about how the brain works.  Strongly 
recommended.)

The op-ed is about the public's reaction to the risks of global warming 
and terrorism, but the points he makes are much more general.  He gives 
four reasons why some risks are perceived to be more or less serious 
than they actually are:

1. We over-react to intentional actions, and under-react to accidents, 
abstract events, and natural phenomena.  "That's why we worry more about 
anthrax (with an annual death toll of roughly zero) than influenza (with 
an annual death toll of a quarter-million to a half-million people). 
Influenza is a natural accident, anthrax is an intentional action, and 
the smallest action captures our attention in a way that the largest 
accident doesn't. If two airplanes had been hit by lightning and crashed 
into a New York skyscraper, few of us would be able to name the date on 
which it happened."

2. We over-react to things that offend our morals.  "When people feel 
insulted or disgusted, they generally do something about it, such as 
whacking each other over the head, or voting. Moral emotions are the 
brain's call to action."

He doesn't say it, but it's reasonable to assume that we under-react to 
things that don't.

3.  We over-react to immediate threats and under-react to long-term 
threats.  "The brain is a beautifully engineered get-out-of-the-way 
machine that constantly scans the environment for things out of whose 
way it should right now get. That's what brains did for several hundred 
million years -- and then, just a few million years ago, the mammalian 
brain learned a new trick: to predict the timing and location of dangers 
before they actually happened.  Our ability to duck that which is not 
yet coming is one of the brain's most stunning innovations, and we 
wouldn't have dental floss or 401(k) plans without it. But this 
innovation is in the early stages of development. The application that 
allows us to respond to visible baseballs is ancient and reliable, but 
the add-on utility that allows us to respond to threats that loom in an 
unseen future is still in beta testing."

4.  We under-react to changes that occur slowly and over time.  "The 
human brain is exquisitely sensitive to changes in light, sound, 
temperature, pressure, size, weight and just about everything else. But 
if the rate of change is slow enough, the change will go undetected. If 
the low hum of a refrigerator were to increase in pitch over the course 
of several weeks, the appliance could be singing soprano by the end of 
the month and no one would be the wiser."

It's interesting to compare this to what I wrote in "Beyond Fear" (pages 
26-27) about perceived vs. actual risk:

" * People exaggerate spectacular but rare risks and downplay common 
risks. They worry more about earthquakes than they do about slipping on 
the bathroom floor, even though the latter kills far more people than 
the former. Similarly, terrorism causes far more anxiety than common 
street crime, even though the latter claims many more lives. Many people 
believe that their children are at risk of being given poisoned candy by 
strangers at Halloween, even though there has been no documented case of 
this ever happening.

" *People have trouble estimating risks for anything not exactly like 
their normal situation. Americans worry more about the risk of mugging 
in a foreign city, no matter how much safer it might be than where they 
live back home. Europeans routinely perceive the U.S. as being full of 
guns. Men regularly underestimate how risky a situation might be for an 
unaccompanied woman. The risks of computer crime are generally believed 
to be greater than they are, because computers are relatively new and 
the risks are unfamiliar. Middle-class Americans can be particularly 
naive and complacent; their lives are incredibly secure most of the 
time, so their instincts about the risks of many situations have been 
dulled.

" * Personified risks are perceived to be greater than anonymous risks. 
Joseph Stalin said, 'A single death is a tragedy, a million deaths is a 
statistic.' He was right; large numbers have a way of blending into each 
other. The final death toll from 9/11 was less than half of the initial 
estimates, but that didn't make people feel less at risk. People gloss 
over statistics of automobile deaths, but when the press writes page 
after page about nine people trapped in a mine -- complete with 
human-interest stories about their lives and families -- suddenly 
everyone starts paying attention to the dangers with which miners have 
contended for centuries. Osama bin Laden represents the face of Al 
Qaeda, and has served as the personification of the terrorist threat. 
Even if he were dead, it would serve the interests of some politicians 
to keep him "alive" for his effect on public opinion.

" * People underestimate risks they willingly take and overestimate 
risks in situations they can't control. When people voluntarily take a 
risk, they tend to underestimate it. When they have no choice but to 
take the risk, they tend to overestimate it. Terrorists are scary 
because they attack arbitrarily, and from nowhere. Commercial airplanes 
are perceived as riskier than automobiles, because the controls are in 
someone else's hands -- even though they're much safer per passenger 
mile. Similarly, people overestimate even more those risks that they 
can't control but think they, or someone, should. People worry about 
airplane crashes not because we can't stop them, but because we think as 
a society we should be capable of stopping them (even if that is not 
really the case). While we can't really prevent criminals like the two 
snipers who terrorized the Washington, DC, area in the fall of 2002 from 
killing, most people think we should be able to.

"Last, people overestimate risks that are being talked about and remain 
an object of public scrutiny. News, by definition, is about anomalies. 
Endless numbers of automobile crashes hardly make news like one airplane 
crash does. The West Nile virus outbreak in 2002 killed very few people, 
but it worried many more because it was in the news day after day. AIDS 
kills about 3 million people per year worldwide -- about three times as 
many people each day as died in the terrorist attacks of 9/11. If a 
lunatic goes back to the office after being fired and kills his boss and 
two coworkers, it's national news for days. If the same lunatic shoots 
his ex-wife and two kids instead, it's local news...maybe not even the 
lead story."

http://www.latimes.com/news/opinion/sunday/commentary/la-op-gilbert2jul02,0,4254536.story
or http://tinyurl.com/ydw3up


** *** ***** ******* *********** *************

      Crypto-Gram Reprints



The Security of RFID Passports:
http://www.schneier.com/crypto-gram-0511.html#1

Liabilities and Software Vulnerabilities:
http://www.schneier.com/crypto-gram-0511.html#2

The Zotob Worm:
http://www.schneier.com/crypto-gram-0511.html#12

Why Election Technology is Hard:
http://www.schneier.com/crypto-gram-0411.html#1

Electronic Voting Machines:
http://www.schneier.com/crypto-gram-0411.html#2

The Security of Checks and Balances
http://www.schneier.com/crypto-gram-0411.html#10

Security Information Management Systems (SIMS):
http://www.schneier.com/crypto-gram-0411.html#12

Technology and Counterterrorism:
http://www.schneier.com/crypto-gram-0411.html#13

Airplane Hackers:
http://www.schneier.com/crypto-gram-0311.html#1

The Trojan Defense
http://www.schneier.com/crypto-gram-0311.html#8

Full Disclosure:
http://www.schneier.com/crypto-gram-0111.html#1

Why Digital Signatures are Not Signatures
http://www.schneier.com/crypto-gram-0011.html#1

Programming Satan's Computer: Why Computers Are Insecure
http://www.schneier.com/crypto-gram-9911.html#WhyComputersareInsecure or 
http://tinyurl.com/7ldrl

Elliptic Curve Public-Key Cryptography
http://www.schneier.com/crypto-gram-9911.html#EllipticCurvePublic-KeyCryptography
or http://tinyurl.com/a2low

The Future of Fraud: Three reasons why electronic commerce is different
http://www.schneier.com/crypto-gram-9811.html#commerce

Software Copy Protection: Why copy protection does not work
http://www.schneier.com/crypto-gram-9811.html#copy


Crypto-Gram is currently in its ninth year of publication.  Back issues 
cover a variety of security-related topics, and can all be found on 
<http://www.schneier.com/crypto-gram-back.html>.  These are a selection 
of articles that appeared in this calendar month in other years.


** *** ***** ******* *********** *************

      Total Information Awareness Is Back



Remember Total Information Awareness (TIA), the massive database on 
everyone that was supposed to find terrorists?  The public found it so 
abhorrent, and objected so forcefully, that Congress killed funding for 
the program in September 2003.

None of us thought that meant the end of TIA, only that it would turn 
into a classified program and be renamed. Well, the program is now 
called Tangram, and it is classified.

The "National Journal" writes:

"The government's top intelligence agency is building a computerized 
system to search very large stores of information for patterns of 
activity that look like terrorist planning. The system, which is run by 
the Office of the Director of National Intelligence, is in the early 
research phases and is being tested, in part, with government 
intelligence that may contain information on U.S. citizens and other 
people inside the country.

"It encompasses existing profiling and detection systems, including 
those that create 'suspicion scores' for suspected terrorists by 
analyzing very large databases of government intelligence, as well as 
records of individuals' private communications, financial transactions, 
and other everyday activities."

The information about Tangram comes from a government document looking 
for contractors to help design and build the system.

DefenseTech writes:  "The document, which is a description of the 
Tangram program for potential contractors, describes other, existing 
profiling and detection systems that haven't moved beyond so-called 
'guilt-by-association models,' which link suspected terrorists to 
potential associates, but apparently don't tell analysts much about why 
those links are significant. Tangram wants to improve upon these 
methods, as well as investigate the effectiveness of other detection 
links such as 'collective inferencing,' which attempt to create 
suspicion scores of entire networks of people simultaneously."
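
To make the jargon concrete, here is a toy sketch of what a 
"guilt-by-association" suspicion score might look like: each person's score 
is nudged upward by the scores of the people they contact. The graph, the 
weights, and the update rule are all invented for illustration; none of this 
comes from the Tangram documents themselves.

# Hypothetical "guilt-by-association" sketch: suspicion propagates a few
# steps along a made-up contact graph.
contacts = {
    "alice": ["bob", "carol"],
    "bob":   ["alice", "dave"],
    "carol": ["alice"],
    "dave":  ["bob"],
}
score = {"alice": 0.9, "bob": 0.0, "carol": 0.0, "dave": 0.0}  # seeded suspect

DAMPING = 0.5
for _ in range(3):      # a few rounds of "collective inferencing"
    score = {
        person: max(score[person],
                    DAMPING * sum(score[p] for p in links) / len(links))
        for person, links in contacts.items()
    }

print(score)   # everyone connected to the seed now carries some suspicion

Even in this toy, the failure mode is obvious: anyone linked, however 
innocently, to the seeded suspect picks up a score.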

Data mining for terrorists has always been a dumb idea. And the 
existence of Tangram illustrates the problem with Congress trying to 
stop a program by killing its funding; it just comes back under a 
different name.

http://nationaljournal.com/about/njweekly/stories/2006/1020nj3.htm
http://www.fbo.gov/spg/USAF/AFMC/AFRLRRS/Reference-Number-BAA-06-04-IFKA/SynopsisP.html
or http://tinyurl.com/y5sg9l
http://www.defensetech.org/archives/002875.html

My previous writings on data mining:
http://www.schneier.com/blog/archives/2006/03/data_mining_for.html
http://www.schneier.com/blog/archives/2006/05/the_problems_wi.html


** *** ***** ******* *********** *************

      Forge Your Own Boarding Pass



Last week Christopher Soghoian created a Fake Boarding Pass Generator 
website, allowing anyone to create a fake Northwest Airlines boarding 
pass: any name, airport, date, flight.  This action got him visited by 
the FBI, who later came back, smashed open his front door, and seized 
his computers and other belongings. It resulted in calls for his arrest 
-- the most visible by Rep. Edward Markey (D-Massachusetts) -- who has 
since recanted. And it's gotten him more publicity than he ever dreamed of.

All for demonstrating a known and obvious vulnerability in airport 
security involving boarding passes and IDs.

This vulnerability is nothing new. There was an article on CSOonline 
from February 2006. There was an article on Slate from February 2005. 
Sen. Chuck Schumer spoke about it in 2005 as well. I wrote about it in 
the August 2003 issue of Crypto-Gram. It's possible I was the first 
person to publish it, but I certainly wasn't the first person to think 
of it.

It's kind of obvious, really. If you can make a fake boarding pass, you 
can get through airport security with it. Big deal; we know.

You can also use a fake boarding pass to fly on someone else's ticket. 
The trick is to have two boarding passes: one legitimate, in the name 
the reservation is under, and another phony one that matches the name on 
your photo ID. Use the fake boarding pass in your name to get through 
airport security, and the real ticket in someone else's name to board 
the plane.

This means that a terrorist on the no-fly list can get on a plane: He 
buys a ticket in someone else's name, perhaps using a stolen credit 
card, and uses his own photo ID and a fake ticket to get through airport 
security. Since the ticket is in an innocent's name, it won't raise a 
flag on the no-fly list.

You can also use a fake boarding pass instead of your real one if you 
have the "SSSS" mark and want to avoid secondary screening, or if you 
don't have a ticket but want to get into the gate area.

Historically, forging a boarding pass was difficult. It required special 
paper and equipment. But since Alaska Airlines started the trend in 
1999, most airlines now allow you to print your boarding pass using your 
home computer and bring it with you to the airport. This program was 
temporarily suspended after 9/11, but was quickly brought back because 
of pressure from the airlines. People who print the boarding passes at 
home can go directly to airport security, and that means fewer airline 
agents are required.

Airline websites generate boarding passes as graphics files, which means 
anyone with a little bit of skill can modify them in a program like 
Photoshop. All Soghoian's website did was automate the process with a 
single airline's boarding passes.

Soghoian claims that he wanted to demonstrate the vulnerability. You 
could argue that he went about it in a stupid way, but I don't think 
what he did is substantively worse than what I wrote in 2003. Or what 
Schumer described in 2005. Why is it that the person who demonstrates 
the vulnerability is vilified while the person who describes it is 
ignored? Or, even worse, the organization that causes it is ignored? Why 
are we shooting the messenger instead of discussing the problem?

As I wrote in 2005: "The vulnerability is obvious, but the general 
concepts are subtle. There are three things to authenticate: the 
identity of the traveler, the boarding pass and the computer record. 
Think of them as three points on the triangle. Under the current system, 
the boarding pass is compared to the traveler's identity document, and 
then the boarding pass is compared with the computer record. But because 
the identity document is never compared with the computer record -- the 
third leg of the triangle -- it's possible to create two different 
boarding passes and have no one notice. That's why the attack works."

The way to fix it is equally obvious: Verify the accuracy of the 
boarding passes at the security checkpoints. If passengers had to scan 
their boarding passes as they went through screening, the computer could 
verify that the boarding pass already matched to the photo ID also 
matched the data in the computer. Close the authentication triangle and 
the vulnerability disappears.
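
Here is a minimal sketch of what "closing the triangle" means in code. The 
record format and field names are hypothetical; the point is only that the 
checkpoint comparison adds the missing leg -- boarding pass against the 
airline's computer record.

# Minimal sketch of the authentication triangle (all field names hypothetical).

def checkpoint_today(photo_id_name, boarding_pass, reservations):
    """Current check: photo ID vs. boarding pass only (two legs)."""
    return photo_id_name == boarding_pass["name"]

def checkpoint_closed(photo_id_name, boarding_pass, reservations):
    """Proposed check: also verify the pass against the airline record,
    closing the third leg so a forged pass in the traveler's own name fails."""
    record = reservations.get(boarding_pass["record_locator"])
    return (photo_id_name == boarding_pass["name"]
            and record is not None
            and record["name"] == boarding_pass["name"]
            and record["flight"] == boarding_pass["flight"])

reservations = {"ABC123": {"name": "J. Smith", "flight": "NW100"}}
forged = {"name": "T. Traveler", "record_locator": "ABC123", "flight": "NW100"}

print(checkpoint_today("T. Traveler", forged, reservations))   # True: forgery passes
print(checkpoint_closed("T. Traveler", forged, reservations))  # False: forgery caught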

But before we start spending time and money and Transportation Security 
Administration agents, let's be honest with ourselves: The photo ID 
requirement is no more than security theater. Its only security purpose 
is to check names against the no-fly list, which would still be a joke 
even if it weren't so easy to circumvent. Identification is not a useful 
security measure here.

Interestingly enough, while the photo ID requirement is presented as an 
antiterrorism security measure, it is really an airline-business 
security measure. It was first implemented after the explosion of TWA 
Flight 800 over the Atlantic in 1996. The government originally thought 
a terrorist bomb was responsible, but the explosion was later shown to 
be an accident.

Unlike every other airplane security measure -- including reinforcing 
cockpit doors, which could have prevented 9/11 -- the airlines didn't 
resist this one, because it solved a business problem: the resale of 
non-refundable tickets. Before the photo ID requirement, these tickets 
were regularly advertised in classified pages: "Round trip, New York to 
Los Angeles, 11/21-30, male, $100." Since the airlines never checked 
IDs, anyone of the correct gender could use the ticket. Airlines hated 
that, and tried repeatedly to shut that market down. In 1996, the 
airlines were finally able to solve that problem and blame it on the FAA 
and terrorism.

So business is why we have the photo ID requirement in the first place, 
and business is why it's so easy to circumvent it. Instead of going 
after someone who demonstrates an obvious flaw that is already public, 
let's focus on the organizations that are actually responsible for this 
security failure and have failed to do anything about it for all these 
years. Where's the TSA's response to all this?

The problem is real, and the Department of Homeland Security and TSA 
should either fix the security or scrap the system. What we've got now 
is the worst security system of all: one that annoys everyone who is 
innocent while failing to catch the guilty.

This is my 30th essay for Wired.com:
http://www.wired.com/news/columns/0,72045-0.html

News:
http://j0hn4d4m5.bravehost.com
http://slightparanoia.blogspot.com/2006/10/post-fbi-visit.html
http://slightparanoia.blogspot.com/2006/10/fbi-visit-2.html
http://blog.wired.com/27bstroke6/2006/10/congressman_ed_.html
http://markey.house.gov/index.php?option=content&task=view&id=2336&Itemid=125
or http://tinyurl.com/ymjkxa
http://blog.wired.com/27bstroke6/2006/10/boarding_pass_g.html

Older mentions of the vulnerability:
http://www.csoonline.com/read/020106/caveat021706.html
http://www.slate.com/id/2113157/fr/rss/
http://www.senate.gov/~schumer/SchumerWebsite/pressroom/press_releases/2005/PR4123.aviationsecurity021305.html
or http://tinyurl.com/yzoon6
http://www.schneier.com/crypto-gram-0308.html#6

No-fly list:
http://www.schneier.com/blog/archives/2005/12/30000_people_mi.html
http://www.schneier.com/blog/archives/2005/09/secure_flight_n_1.html
http://www.schneier.com/blog/archives/2006/10/nofly_list.html
http://www.schneier.com/blog/archives/2005/08/infants_on_the.html


** *** ***** ******* *********** *************

      News



This article argues that most of the $44 billion spent in the U.S. on 
bioterrorism defense has been wasted.
http://www.newscientist.com/channel/opinion/mg19225725.000

Targeted Trojan horses are the future of malware:
http://news.com.com/The+future+of+malware+Trojan+horses/2100-7349_3-6125453.html
or http://tinyurl.com/w8hx7

FixAVote.com: a good hoax.
http://www.fixavote.com/
http://www.infoworld.com/article/06/10/26/HNfixelections_1.html

Interview with a pickpocket expert:
http://www.kiplinger.com/personalfinance/magazine/archives/2006/11/mystory.html
or http://tinyurl.com/y3n2ap

Swiss police considering using Trojans for VoIP tapping:
http://www.pcpro.co.uk/news/95394/swiss-look-to-trojan-code-for-voip-tapping.html
or http://tinyurl.com/ygq43m

Lousy home security installation.  (Yes, it's an advertisement.  But 
there are still important security lessons in the blog post.)
http://providentsecurity.typepad.com/community_security_the_pr/2006/10/criminal_instal.html
or http://tinyurl.com/ygve7r

I don't think I've ever read anyone talking about class issues as they 
relate to security before.
http://redtape.msnbc.com/2006/10/doublestandards.html

This interesting article in "The New York Times" illustrates that the 
problem of agricultural safety and security mirrors the security issues 
in computer networks, especially with the monoculture in operating 
systems and network protocols.
http://www.nytimes.com/2006/10/15/magazine/15wwln_lede.html
http://www.schneier.com/blog/archives/2006/08/security_and_mo.html

Interesting speculation: "Warning Signs for Tomorrow."
http://www.aleph.se/andart/archives/2006/10/warning_signs_for_tomorrow.html 
or http://tinyurl.com/yylq69

Good essay on perceived vs. actual risk.  The hook is Mayor Daley of 
Chicago demanding a no-fly-zone over Chicago in the wake of the New York 
City airplane crash.
http://www.aopa.org/whatsnew/newsitems/2006/061013enough.html

And, on the same topic, why it doesn't make sense to ban small aircraft 
from cities as a terrorism defense.
http://www.salon.com/tech/col/smith/2006/10/20/askthepilot205/index.html 
or http://tinyurl.com/yh7nz6

Blog entry URL:
http://www.schneier.com/blog/archives/2006/10/perceived_risk.html

Doonesbury on terrorism and fear:
http://www.doonesbury.com/strip/dailydose/index.html?uc_full_date=20061015 
or http://tinyurl.com/yffbq4
http://www.doonesbury.com/strip/dailydose/index.html?uc_full_date=20061016 
or http://tinyurl.com/ylwj4j
http://www.doonesbury.com/strip/dailydose/index.html?uc_full_date=20061017 
or http://tinyurl.com/y76864
http://www.doonesbury.com/strip/dailydose/index.html?uc_full_date=20061018 
or http://tinyurl.com/yfq9sp
http://www.doonesbury.com/strip/dailydose/index.html?uc_full_date=20061019 
or http://tinyurl.com/ye2km5
http://www.doonesbury.com/strip/dailydose/index.html?uc_full_date=20061020 
or http://tinyurl.com/yfbne8
http://www.doonesbury.com/strip/dailydose/index.html?uc_full_date=20061021 
or http://tinyurl.com/yefhz7

Really interesting article about online hacker forums, especially the 
politics that goes on in them.
http://www.usatoday.com/tech/news/computersecurity/infotheft/2006-10-11-cybercrime-hacker-forums_x.htm
or http://tinyurl.com/y8jqbv

Real-world social engineering crime:
http://www.theregister.co.uk/2006/10/20/easynet_brick_lane_robbery/

Here's another social-engineering story (link in Turkish).  The police 
receive an anonymous emergency call from someone claiming to have 
planted an explosive in the Haydarpasa Numune Hospital.  They evacuate 
the hospital (100 patients plus doctors, staff, visitors, etc.) and 
search the place for two hours.  They find nothing.  When patients and 
visitors return, they realize that their valuables were stolen.
http://www.milliyet.com.tr/2006/10/25/yasam/ayas.html

Paramedic stopped at airport security for nitroglycerine residue.  (At 
least we know those chemical-residue detectors are working.)
http://dochazmat.livejournal.com/31044.html

If you have control of a network of computers -- by infecting them with 
some sort of malware -- the hard part is controlling that network. 
Traditionally, these computers (called zombies) are controlled via IRC. 
  But IRC can be detected and blocked, so the hackers have adapted:
http://news.com.com/Zombies+try+to+blend+in+with+the+crowd/2100-7349_3-6127304.html
or http://tinyurl.com/swhbg
The trick here is to not let the computer's legitimate owner know that 
someone else is controlling it.  It's an arms race between attacker and 
defender.

Tamper-evident seals:
http://www.schneier.com/blog/archives/2006/10/tamperevident_s.html
http://pearl1.lanl.gov/seals/default.htm

Microsoft's Privacy Guidelines for Developing Software and Services. 
It's actually pretty good:
http://www.microsoft.com/downloads/details.aspx?FamilyID=c48cf80f-6e87-48f5-83ec-a18d1ad2fc1f&displaylang=en
or http://tinyurl.com/y45oge

Canadian "Guidelines for Identification and Authentication," released by 
the Canadian Privacy Commissioner, is a good document discussing both 
privacy risks and security threats.
http://www.privcom.gc.ca/information/guide/auth_061013_e.asp

And here's a longer document published in 2004 by Industry Canada: 
"Principles for Electronic Authentication."
http://e-com.ic.gc.ca/epic/internet/inecic-ceac.nsf/en/h_gv00240e.html

Blog entry URL:
http://www.schneier.com/blog/archives/2006/10/canadian_guidel.html

Surveillance as performance art:
http://www.worldchanging.com/archives/005105.html
This is extreme, but the level of surveillance is likely to be the norm. 
  It won't be on a public website available to everyone, but it will be 
available to governments and corporations.

"Mother Jones" article on Google and privacy:
http://www.motherjones.com/news/feature/2006/11/google.html

They may be great at keeping you from taking your bottle of water onto 
the plane, but when it comes to catching actual bombs and guns they're not 
very good:  "Screeners at Newark Liberty International Airport, one of 
the starting points for the Sept. 11 hijackers, failed 20 of 22 security 
tests conducted by undercover U.S. agents last week, missing concealed 
bombs and guns at checkpoints throughout the major air hub's three 
terminals, according to federal security officials."
http://www.rawstory.com/showoutarticle.php?src=http%3A%2F%2Fseattletimes.nwsource.com%2Fhtml%2Fnationworld%2F2003327485_screeners28.html
or http://tinyurl.com/yfpogf
As I've written before, this is actually a very hard problem to solve:
http://www.schneier.com/blog/archives/2006/03/airport_passeng.html
Remember this truism:  We can't keep weapons out of prisons. We can't 
possibly keep them out of airports.

The Data Privacy and Integrity Advisory Committee of the Department of 
Homeland Security recommended against putting RFID chips in identity 
cards.  It's only a draft report, but what it says is so controversial 
that a vote on the final report is being delayed.
http://www.dhs.gov/xlibrary/assets/privacy/privacy_advcom_rpt_rfid_draft.pdf

or http://tinyurl.com/y3k2w6
http://www.wired.com/news/technology/1,72019-0.html

Online ID theft hyped, to no one's surprise:
http://www.computerworld.com/action/article.do?command=viewArticleBasic&arti
cleId=9004429 
or http://tinyurl.com/y8mvoz

CEO arrested for stealing the identities of his employees:
http://www.varbusiness.com/sections/news/breakingnews.jhtml?articleId=193500
991 
or http://tinyurl.com/y44w9u

This guy wants to give students bullet-proof textbooks to help in the 
case of school shootings.  You can't make this stuff up.
http://www.wbir.com/news/national/story.aspx?storyid=39017

New U.S. Customs database on trucks and travelers.  It's yet another 
massive government surveillance program:
http://arstechnica.com/news.ars/post/20061103-8143.html
http://edocket.access.gpo.gov/2006/06-9026.htm
http://notabob.blogspot.com/2006/11/in-crosshairs_03.html
http://www.eff.org/deeplinks/archives/004980.php
http://blog.wired.com/27bstroke6/2006/11/homeland_securi.html
http://www.washingtonpost.com/wp-dyn/content/article/2006/11/02/AR2006110201
810.html 
or http://tinyurl.com/yl92on

Classical crypto with lasers.  I simply don't have the physics 
background to evaluate it:
http://www.physorg.com/news80478394.html
http://authors.library.caltech.edu/5655/

On August 18 of last year, the Zotob worm badly infected computers at 
the Department of Homeland Security, particularly the 1,300 workstations 
running the US-VISIT application at border crossings.  Wired News filed 
a Freedom of Information Act request for details, which was denied.  So 
they sued.  Eventually the government was forced to cough up the 
documents.  They reveal nothing about the technical details of the 
computer systems, and only point to the incompetence of the DHS in 
handling the incident.
http://www.wired.com/news/technology/0,72051-0.html

Seagate has announced a product called DriveTrust, which provides 
hardware-based encryption on the drive itself.  The technology is 
proprietary, but they use standard algorithms: AES and triple-DES, RSA, 
and SHA-1.  Details on the key management are sketchy, but the system 
requires a pre-boot password and/or combination of biometrics to access 
the disk.  And Seagate is working on some sort of enterprise-wide key 
management system to make it easier to deploy the technology 
company-wide.  The first target market is laptop computers.  No computer 
manufacturer has announced support for DriveTrust yet.
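
Since the key-management details are sketchy, here's only a generic 
sketch of how drive-level encryption of this sort usually hangs 
together; the algorithm choices and parameters below are illustrative, 
not DriveTrust's:

  # Generic sketch of a hardware full-disk encryption key hierarchy.
  # Parameters are illustrative; the DriveTrust design isn't published.
  import hashlib, os

  password = b"pre-boot password"
  salt = os.urandom(16)            # stored in the drive's reserved area

  # 1. Derive a key-encrypting key (KEK) from the pre-boot password.
  kek = hashlib.pbkdf2_hmac("sha256", password, salt, 100_000)

  # 2. A random data key does the actual AES sector encryption; it
  #    never leaves the drive and is stored only in wrapped (encrypted)
  #    form under the KEK.
  data_key = os.urandom(32)

  # 3. Changing the password just re-wraps the data key; destroying the
  #    wrapped copy is an instant "crypto erase" of the whole disk.
  print("KEK", kek.hex()[:12], "data key", data_key.hex()[:12])
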
http://www.seagate.com/cda/newsinfo/newsroom/releases/article/0,1121,3347,00
.html 
or http://tinyurl.com/y7tvvd
http://www.pcworld.com/article/id,127701/article.html
http://www.theglobeandmail.com/servlet/story/RTGAM.20061030.wharddrive1029/B
NStory/Technology/?page=rss&id=RTGAM.20061030.wharddrive1029 
or http://tinyurl.com/y5twtg
http://news.com.com/Seagate+bakes+security+into+hard-disk+drive/2100-1029_3-
6130824.html 
or http://tinyurl.com/y4kzhk
http://www.cio.com/blog_view.html?CID=26159
http://www.sfgate.com/cgi-bin/article.cgi?f=/c/a/2006/10/30/BUGU2M1ETT1.DTL 
or http://tinyurl.com/yjvac7

It's easy to skim personal information off an RFID credit card.
http://www.nytimes.com/2006/10/23/business/23card.html
http://www.theregister.co.uk/2006/10/24/rfid_credit_card_hack/
http://www.rfidjournal.com/article/articleview/2749/1/1/

Why management doesn't get IT security:
http://www.schneier.com/blog/archives/2006/11/why_management.html

"Keyboards and Covert Channels."  Interesting research.
http://www.crypto.com/papers/jbug-Usenix06-final.pdf

"Deconstructing Information Warfare"
http://www.information-retrieval.info/PIW/deconstructing/Taipale-IW-103006.p
df 
or http://tinyurl.com/y2x9vt

The Future of Identity in the Information Society (FIDIS) hates RFID 
passports:
http://www.fidis.net/press-events/press-releases/budapest-declaration/ 
http://it.slashdot.org/it/06/11/09/1757202.shtml or 
http://tinyurl.com/y4eht7

Good essay on data mining.
http://www.theregister.co.uk/2006/11/08/guilty_associations/

Cryptography comic: Alice, Bob, and Eve.  (I get a mention, too.)
http://xkcd.com/c177.html

UK car rentals to require fingerprints.  Not optional, required.
http://www.schneier.com/blog/archives/2006/11/uk_car_rentals.html
http://news.bbc.co.uk/1/hi/magazine/6129084.stm

A classified Wikipedia for the U.S. intelligence services:
http://news.yahoo.com/s/nm/20061031/wr_nm/internet_intelligence_dc_1


** *** ***** ******* *********** *************

      The Death of Ephemeral Conversation



The political firestorm over former U.S. Rep. Mark Foley's salacious 
instant messages hides another issue, one about privacy. We are rapidly 
turning into a society where our intimate conversations can be saved and 
made public later. This represents an enormous loss of freedom and 
liberty, and the only way to solve the problem is through legislation.

Everyday conversation used to be ephemeral. Whether face-to-face or by 
phone, we could be reasonably sure that what we said disappeared as soon 
as we said it. Of course, organized crime bosses worried about phone 
taps and room bugs, but that was the exception. Privacy was the default 
assumption.

This has changed. We now type our casual conversations. We chat in 
e-mail, with instant messages on our computer and SMS messages on our 
cell phones, and in comments on social networking Web sites like 
Friendster, LiveJournal, and MySpace. These conversations -- with 
friends, lovers, colleagues, fellow employees -- are not ephemeral; they 
leave their own electronic trails.

We know this intellectually, but we haven't truly internalized it. We 
type on, engrossed in conversation, forgetting that we're being recorded.

Foley's instant messages were saved by the young men he talked to, but 
they could have also been saved by the instant messaging service. There 
are tools that allow both businesses and government agencies to monitor 
and log IM conversations. E-mail can be saved by your ISP or by the IT 
department in your corporation. Gmail, for example, saves everything, 
even if you delete it.
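
Building that kind of logging is trivial, which is exactly the problem. 
A minimal sketch -- the relay() function and file name are hypothetical 
-- of what any IM service, corporate gateway, or ISP can do with every 
message that passes through it:

  # Sketch: anything that relays a message can also keep it forever.
  # The relay() function and file name are hypothetical.
  import json, time

  archive = open("im_archive.jsonl", "a")

  def relay(sender, recipient, text):
      # Deliver the message -- and quietly append a permanent record.
      record = {"ts": time.time(), "from": sender, "to": recipient,
                "text": text}
      archive.write(json.dumps(record) + "\n")
      archive.flush()
      return text   # delivered as usual; neither party sees the logging

  relay("alice", "bob", "let's keep this between us")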

And these conversations can come back to haunt people -- in criminal 
prosecutions, divorce proceedings or simply as embarrassing disclosures. 
During the 1998 Microsoft anti-trust trial, the prosecution pored over 
masses of e-mail, looking for a smoking gun. Of course they found 
things; everyone says things in conversation that, taken out of context, 
can prove anything.

The moral is clear: If you type it and send it, prepare to explain it in 
public later.

And voice is no longer a refuge. Face-to-face conversations are still 
safe, but we know that the National Security Agency is monitoring 
everyone's international phone calls. (They said nothing about SMS 
messages, but one can assume they were monitoring those too.) Routine 
recording of phone conversations is still rare -- certainly the NSA has 
the capability -- but will become more common as telephone calls 
continue migrating to the IP network.

If you find this disturbing, you should. Fewer conversations are 
ephemeral, and we're losing control over the data. We trust our ISPs, 
employers and cell phone companies with our privacy, but again and again 
they've proven they can't be trusted. Identity thieves routinely gain 
access to these repositories of our information. Paris Hilton and other 
celebrities have been the victims of hackers breaking into their cell 
phone providers' networks.  Google reads our Gmail and inserts 
context-dependent ads.

Even worse, normal constitutional protections don't apply to much of 
this. The police need a court-issued warrant to search our papers or 
eavesdrop on our communications, but can simply issue a subpoena -- or 
ask nicely or threateningly -- for data of ours that is held by a third 
party, including stored copies of our communications.

The Justice Department wants to make this problem even worse, by forcing 
ISPs and others to save our communications -- just in case we're someday 
the target of an investigation. This is not only bad privacy and 
security, it's a blow to our liberty as well. A world without ephemeral 
conversation is a world without freedom.

We can't turn back technology; electronic communications are here to 
stay. But as technology makes our conversations less ephemeral, we need 
laws to step in and safeguard our privacy. We need a comprehensive data 
privacy law, protecting our data and communications regardless of where 
it is stored or how it is processed. We need laws forcing companies to 
keep it private and to delete it as soon as it is no longer needed.

And we need to remember, whenever we type and send, we're being watched.

Foley is an anomaly. Most of us do not send instant messages in order to 
solicit sex with minors. Law enforcement might have a legitimate need to 
access Foley's IMs, e-mails and cell phone calling logs, but that's why 
there are warrants supported by probable cause -- they help ensure that 
investigations are properly focused on suspected pedophiles, terrorists 
and other criminals. We saw this in the recent UK terrorist arrests; 
focused investigations on suspected terrorists foiled the plot, not 
broad surveillance of everyone without probable cause.

Without legal privacy protections, the world becomes one giant airport 
security area, where the slightest joke -- or comment made years before 
-- lands you in hot water. The world becomes one giant market-research 
study, where we are all life-long subjects. The world becomes a police 
state, where we all are assumed to be Foleys and terrorists in the eyes 
of the government.

This essay originally appeared on Forbes.com:
http://www.forbes.com/security/2006/10/18/nsa-im-foley-tech-security-cx_bs_1
018security.html 
or http://tinyurl.com/ymmnee


** *** ***** ******* *********** *************

      Airline Passenger Profiling for Profit



I have previously written and spoken about the privacy threats that come 
from the confluence of government and corporate interests.  It's not the 
deliberate police-state privacy invasions from governments that worry 
me, but the normal-business privacy invasions by corporations -- and how 
corporate privacy invasions pave the way for government privacy 
invasions and vice versa.

The U.S. government's airline passenger profiling system was called 
Secure Flight, and I've written about it extensively.  At one point, the 
system was going to perform automatic background checks on all 
passengers based on both government and commercial databases -- credit 
card databases, phone records, whatever -- and assign everyone a "risk 
score" based on the data. Those with a higher risk score would be 
searched more thoroughly than those with a lower risk score. It's a 
complete waste of time, and a huge invasion of privacy, and the last 
time I paid attention it had been scrapped.
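
To make the mechanism concrete, here's a hypothetical sketch of that 
kind of scoring.  Every flag, weight, and threshold is invented, which 
is part of the problem: whoever picks them decides who gets searched.

  # Hypothetical passenger risk scoring; all flags, weights, and the
  # threshold are invented for illustration.
  WEIGHTS = {
      "name_matches_watch_list": 50,
      "one_way_cash_ticket":     20,
      "thin_credit_history":     10,   # commercial-database "evidence"
      "recent_address_change":    5,
  }

  def risk_score(flags):
      return sum(WEIGHTS[f] for f in flags if f in WEIGHTS)

  passenger = ["one_way_cash_ticket", "thin_credit_history",
               "recent_address_change"]
  score = risk_score(passenger)
  print("score", score,
        "-> secondary search" if score >= 30 else "-> normal screening")
  # Terrorists are so rare that nearly everyone flagged this way is a
  # false positive.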

But the very same system that is useless at picking terrorists out of 
passenger lists is probably very good at identifying consumers. So what 
the government rightly decided not to do, the start-up corporation 
Jetera is doing instead:

"Jetera would start with an airline's information on individual 
passengers on board a given flight, drawing the name, address, credit 
card number and loyalty club status from reservations data. Through a 
process, for which it seeks a patent, the company would match the 
passenger's identification data with the mountains of information about 
him or her available at one of the mammoth credit bureaus, which 
maintain separately managed marketing as well as credit information. 
Jetera would tap into the marketing side, showing consumer demographics, 
purchases, interests, attitudes and the like.

"Jetera's data manipulation would shape the entertainment made available 
to each passenger during a flight. The passenger who subscribes to a 
do-it-yourself magazine might be offered a video on woodworking. Catalog 
purchase records would boost some offerings and downplay others. Sports 
fans, known through their subscriptions, credit card ticket-buying or 
booster club memberships, would get 'The Natural' instead of 'Pretty 
Woman.'"
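
The patented matching process isn't described in any detail, so this is 
only a guess at its shape -- join the reservation record to a marketing 
profile, then rank the catalog by inferred interests.  All the field 
names and data below are hypothetical:

  # Hypothetical sketch of Jetera-style matching; fields and data are
  # invented.
  reservation = {"name": "J. Smith", "address": "12 Elm St",
                 "loyalty_tier": "gold"}

  # What a credit bureau's marketing side might return for that identity.
  profile = {"interests": {"woodworking", "baseball"}}

  catalog = {
      "woodworking video": {"woodworking"},
      "The Natural":       {"baseball"},
      "Pretty Woman":      {"romance"},
  }

  def rank_offerings(profile, catalog):
      # Sort titles by overlap with the passenger's inferred interests.
      return sorted(catalog,
                    key=lambda t: -len(profile["interests"] & catalog[t]))

  print(rank_offerings(profile, catalog))
  # ['woodworking video', 'The Natural', 'Pretty Woman']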

The article is dated August 21, 2006, and is subscriber-only. Most of it 
talks about the revenue potential of the model, the funding the company 
received, and the talks it has had with anonymous airlines. No airline 
has signed up for the service yet, which would not only include 
in-flight personalization but pre- and post-flight mailings and other 
personalized services. Privacy is dealt with at the end of the article:

"Jetera sees two legal issues regarding privacy and resolves both in its 
favor. Nothing Jetera intends to do would violate federal law or airline 
privacy policies as expressed on their websites. In terms of customer 
perceptions, Jetera doesn't intend to abuse anyone's privacy and will 
have an 'opt-out' opportunity at the point where passengers make 
inflight entertainment choices.

"If an airline wants an opt-out feature at some other point in the 
process, Jetera will work to provide one, McChesney says. Privacy and 
customer service will be an issue for each airline, and Jetera will 
adapt specifically to each."

The U.S. government already collects data from the phone company, from 
hotels and rental-car companies, and from airlines. How long before it 
piggybacks onto this system?

The other side to this is in the news, too: commercial databases using 
government data:

"Records once held only in paper form by law enforcement agencies, 
courts and corrections departments are now routinely digitized and sold 
in bulk to the private sector. Some commercial databases now contain 
more than 100 million criminal records. They are updated only fitfully, 
and expunged records now often turn up in criminal background checks 
ordered by employers and landlords."

http://www.aviationnow.com/search/AvnowSearchResult.do?reference=xml/awst_xm
l/2006/08/21/AW_08_21_2006_P55-56-01.xml&query=jetera 
or http://tinyurl.com/tt59x
http://www.nytimes.com/2006/10/17/us/17expunge.html

My previous writings:
http://www.schneier.com/blog/archives/2006/03/the_future_of_p.html
http://www.schneier.com/blog/archives/2005/09/secure_flight_n_1.html


** *** ***** ******* *********** *************

      Counterpane News



BT Acquires Counterpane:

On October 25, British Telecom announced that it acquired Counterpane 
Internet Security, Inc.

This is something I've been working on for about a year, and I'm 
thrilled that it has finally come to pass.

http://www.btplc.com/News/Articles/Showarticle.cfm?ArticleID=386c1b2f-0860-4
afc-8f4a-26a066c12d10 
or http://tinyurl.com/yzmtn3

Newspapers:
http://today.reuters.com/news/articleinvesting.aspx?view=CN&storyID=2006-10-
25T071554Z_01_L25202546_RTRIDST_0_TELECOMS-COUNTERPANE-BT-UPDATE-1.XML&rpc=6
6&type=qcna 
or http://tinyurl.com/y28vr3
http://news.bbc.co.uk/1/hi/business/6083818.stm
http://www.businessweek.com/ap/financialnews/D8KVRV6O1.htm or 
http://tinyurl.com/ylmw5f
http://business.timesonline.co.uk/article/0,,13129-2422003,00.html
http://business.guardian.co.uk/story/0,,1930942,00.html
http://www.iht.com/articles/ap/2006/10/25/business/EU_FIN_COM_Britain_BT_Gro
up.php 
or http://tinyurl.com/vxhvp
http://www.mercurynews.com/mld/mercurynews/business/technology/15847133.htm 
or http://tinyurl.com/wba9a
http://www.smh.com.au/news/TECHNOLOGY/BT-buys-security-specialist-Counterpan
e-cofounded-by-cryptologistSchneier/2006/10/26/1161749214324.html 
or http://tinyurl.com/y8632k
http://www.twincities.com/mld/twincities/15848925.htm

Trade and news media:
http://news.com.com/BT+snaps+up+Counterpane+Internet+Security/2100-1002_3-61
29284.html 
or http://tinyurl.com/y5hzeh
http://news.zdnet.com/2100-1009_22-6129284.html
http://www.redherring.com/Article.aspx?a=19374&hed=BT+Snags+Counterpane&sect
or=Industries&subsector=Communications 
or http://tinyurl.com/yxx6ej
http://www.scmagazine.com/uk/news/article/600346/bt-acquires-counterpane-sec
urity/ 
or http://tinyurl.com/v7rmh
http://www.itweek.co.uk/vnunet/news/2167238/bt-buys-security-outsourcer 
or http://tinyurl.com/uq88x
http://www.networkworld.com/news/2006/102506-bt-buys.html
http://www.techworld.com/security/news/index.cfm?newsID=7188&pagtype=all 
or http://tinyurl.com/y7dzzf
http://www.eetimes.com/news/latest/showArticle.jhtml?articleID=193402188 
or http://tinyurl.com/smvda
http://www.ovum.com/news/euronews.asp?id=5014
http://news.moneycentral.msn.com/provider/providerarticle.asp?feed=OBR&Date=
20061025&ID=6133629 
or http://tinyurl.com/y3lzaj

Foreign press:
http://www.theage.com.au/news/Technology/BT-buys-security-specialist-Counter
pane-cofounded-by-cryptologistSchneier/2006/10/26/1161749214324.html 
or http://tinyurl.com/y36vt2
http://press-releases.techwhack.com/5016/counterpane-bt/
http://www.metimes.com/storyview.php?StoryID=20061025-074107-7311r
http://www.canada.com/topics/technology/news/gizmos/story.html?id=a177c27c-b
5c6-4eb9-be30-a96cf83b8ed0&k=50643 
or http://tinyurl.com/y4wal4
http://www.breakingnews.ie/2006/10/25/story282489.html
http://www.euro2day.gr/articlesfna/22917952/
http://www.net-security.org/secworld.php?id=4334

British tabloids:
http://www.thesun.co.uk/article/0,,11039-2006490511,00.html
http://www.mirror.co.uk/news/tm_headline=bt-in-code-war-&method=full&objecti
d=17992435&siteid=94762-name_page.html 
or http://tinyurl.com/y2aqxy

Best blog comment ever:
http://www.schneier.com/blog/archives/2006/10/bt_acquires_cou.html#c121821 
or http://tinyurl.com/ug2oo

Commentary from one of our investors:
http://whohastimeforthis.blogspot.com/2006/11/british-telecom-dials-up-da-vi
nci-code.html 
or http://tinyurl.com/y6rdm3

Blog entry URL:
http://www.schneier.com/blog/archives/2006/10/bt_acquires_cou.html


** *** ***** ******* *********** *************

      Architecture and Security



You've seen them: those large concrete blocks in front of skyscrapers, 
monuments and government buildings, designed to protect against car and 
truck bombs. They sprang up like weeds in the months after 9/11, but the 
idea is much older. The prettier ones doubled as planters; the uglier 
ones just stood there.

Form follows function. From medieval castles to modern airports, 
security concerns have always influenced architecture. Castles appeared 
during the reign of King Stephen of England because they were the best 
way to defend the land and there wasn't a strong king to put any limits 
on castle-building. But castle design changed over the centuries in 
response to both innovations in warfare and politics, from 
motte-and-bailey to concentric design in the late medieval period to 
entirely decorative castles in the 19th century.

These changes were expensive. The problem is that architecture tends 
toward permanence, while security threats change much faster. Something 
that seemed a good idea when a building was designed might make little 
sense a century -- or even a decade -- later. But by then it's hard to 
undo those architectural decisions.

When Syracuse University built a new campus in the mid-1970s, the 
student protests of the late 1960s were fresh on everybody's mind. So 
the architects designed a college without the open greens of traditional 
college campuses. It's now 30 years later, but Syracuse University is 
stuck defending itself against an obsolete threat.

Similarly, hotel entries in Montreal were elevated above street level in 
the 1970s, in response to security worries about Quebecois separatists. 
Today the threat is gone, but those older hotels continue to be 
maddeningly difficult to navigate.

Also in the 1970s, the Israeli consulate in New York built a unique 
security system: a two-door vestibule that allowed guards to identify 
visitors and control building access.  Now this kind of entryway is 
widespread, and buildings with it will remain unwelcoming long after the 
threat is gone.

The same thing can be seen in cyberspace as well. In his book, "Code and 
Other Laws of Cyberspace," Lawrence Lessig describes how decisions about 
technological infrastructure -- the architecture of the internet -- 
become embedded and then impracticable to change. Whether it's 
technologies to prevent file copying, limit anonymity, record our 
digital habits for later investigation or reduce interoperability and 
strengthen monopoly positions, once technologies based on these security 
concerns become standard it will take decades to undo them.

It's dangerously shortsighted to make architectural decisions based on 
the threat of the moment without regard to the long-term consequences of 
those decisions.

Concrete building barriers are an exception: They're removable. They 
started appearing in Washington, D.C., in 1983, after the truck bombing 
of the Marine barracks in Beirut. After 9/11, they were a sort of 
bizarre status symbol: They proved your building was important enough to 
deserve protection. In New York City alone, more than 50 buildings were 
protected in this fashion.

Today, they're slowly coming down. Studies have found they impede 
traffic flow, turn into giant ashtrays, and can pose a security risk of 
their own by becoming flying shrapnel if a bomb explodes nearby.

We should be thankful they can be removed, and did not end up as 
permanent aspects of our cities' architecture. We won't be so lucky with 
some of the design decisions we're seeing about internet architecture.

This essay originally appeared in Wired.com.
http://www.wired.com/news/columns/0,71968-0.html

Concrete barriers coming down in New York:
http://www.nytimes.com/2006/10/07/nyregion/nyregionspecial3/07bollard.html 
or http://tinyurl.com/y6v2xw

Activism-restricting architecture at the University of Texas:
http://www.utwatch.org/archives/polemicist/utarchitectureandactivism_may1990
.html 
or http://tinyurl.com/stby3

Commentary from the Architectures of Control in Design Blog.
http://architectures.danlockton.co.uk/?p=145


** *** ***** ******* *********** *************

      The Doghouse: Skylark Utilities



I'll just quote this bit: "Files are encrypted in place using the 
524,288 Bit cipher SCC, better know [sic] as the king of ciphers."

http://www.skylarkutilities.com/encode-it/home.html

For reference, here's my snake oil guide from 1999.
http://www.schneier.com/crypto-gram-9902.html#snakeoil


** *** ***** ******* *********** *************

      Heathrow Tests Biometric ID



Heathrow airport is testing an iris scan biometric machine to identify 
passengers at customs.

I've written previously about biometrics: when they work and when they 
fail:  "Biometrics are powerful and useful, but they are not keys. They 
are useful in situations where there is a trusted path from the reader 
to the verifier; in those cases all you need is a unique identifier. 
They are not useful when you need the characteristics of a key: secrecy, 
randomness, the ability to update or destroy. Biometrics are unique 
identifiers, but they are not secrets."

The system under trial at Heathrow is a good use of biometrics.  There's 
a trusted path from the person through the reader to the verifier; 
attempts to use fake eyeballs will be immediately obvious and 
suspicious.  The verifier is being asked to match a biometric with a 
specific reference, and not to figure out who the person is from his or 
her biometric.  There's no need for secrecy or randomness; it's not 
being used as a key.  And it has the potential to really speed up 
customs lines.
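
The verification-versus-identification distinction is worth making 
concrete.  A toy sketch, with a made-up similarity function standing in 
for the real iris-matching math -- the Heathrow system is doing the 
first, not the second:

  # Verification (1:1) vs. identification (1:N), with a toy matcher.
  # Templates and similarity() are stand-ins for real iris-matching math.
  enrolled = {"alice": [0.1, 0.9, 0.3], "bob": [0.8, 0.2, 0.5]}

  def similarity(a, b):
      return 1 - sum(abs(x - y) for x, y in zip(a, b)) / len(a)

  def verify(claimed_id, sample, threshold=0.9):
      # 1:1 -- "is this the person this passport says it is?"
      return similarity(enrolled[claimed_id], sample) >= threshold

  def identify(sample, threshold=0.9):
      # 1:N -- "who, out of everyone enrolled, is this?"  Much harder;
      # false matches grow with the size of the database.
      return [p for p, t in enrolled.items()
              if similarity(t, sample) >= threshold]

  sample = [0.12, 0.88, 0.31]
  print(verify("alice", sample), identify(sample))   # True ['alice']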

http://news.bbc.co.uk/1/hi/uk/1808187.stm
http://www.schneier.com/crypto-gram-9808.html#biometrics


** *** ***** ******* *********** *************

      Please Stop My Car



Residents of Prescott Valley are being invited to register their cars if 
they don't drive in the middle of the night.  Police will then stop 
those cars if they are on the road at that time, under the assumption 
that they're stolen:

"The Watch Your Car decal program is a voluntary program whereby vehicle 
owners enroll their vehicles with the AATA. The vehicle is then entered 
into a special database, developed and maintained by the AATA, which is 
directly linked to the Motor Vehicle Division (MVD).

"Participants then display the Watch Your Car decals in the front and 
rear windows of their vehicle. By displaying the decals, vehicle owners 
convey to law enforcement officials that their vehicle is not usually in 
use between the hours of 1:00 AM and 5:00 AM, when the majority of 
thefts occur.

"If a police officer witnesses the vehicle in operation between these 
hours, they have the authority to pull it over and question the driver. 
With access to the MVD database, the officer will be able to determine 
if the vehicle has been stolen, or not. The program also allows law 
enforcement officials to notify the vehicle's owner immediately upon 
determination that it is being illegally operated."

This program is entirely optional, but there's a serious externality. 
If the police spend time chasing false alarms, they're not available for 
other police business.  If the town charged car owners a fine for each 
false alarm, I would have no problems with this program.  It doesn't 
have to be a large fine, but it has to be enough to offset the cost to 
the town.  It's no different than police departments charging homeowners 
for false burglar alarms, when the alarm systems are automatically 
hooked into the police stations.

http://www.pvaz.net/Services/police/watchyourcar.htm


** *** ***** ******* *********** *************

      Air Cargo Security



The BBC is reporting a "major" hole in air cargo security.  Basically, 
cargo 
is being flown on passenger planes without being screened.  A would-be 
terrorist could therefore blow up a passenger plane by shipping a bomb 
via FedEx.

In general, cargo deserves much less security scrutiny than passengers. 
  Here's the reasoning:

Cargo planes are much less of a terrorist risk than passenger planes, 
because terrorism is about innocents dying.  Blowing up a planeload of 
FedEx packages is annoying, but not nearly as terrorizing as blowing up 
a planeload of tourists.  Hence, the security around air cargo doesn't 
have to be as strict.

Given that, if most air cargo flies around on cargo planes, then it's 
okay for some small amount of cargo to fly as baggage on passenger 
planes -- assuming the selection is random and the shipper doesn't know 
beforehand which packages will be chosen.  A would-be terrorist would be 
better off taking his bomb and blowing up a bus than shipping it and 
hoping it might possibly end up on a passenger plane.
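
A back-of-the-envelope version of that argument, with numbers I'm making 
up purely for illustration:

  # Back-of-the-envelope; both numbers are invented for illustration.
  passenger_fraction = 0.05   # suppose 5% of cargo flies as baggage
  packages_shipped = 10       # a determined attacker ships several bombs

  p_miss = (1 - passenger_fraction) ** packages_shipped
  print(f"chance no package reaches a passenger plane: {p_miss:.0%}")
  # ~60% chance that none of them ever touches a passenger plane -- and
  # that's before any random screening.  The whole argument rests on the
  # passenger fraction actually being small.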

At least, that's the theory.  But theory and practice are different.

The British system involves "known shippers":

"Under a system called "known shipper" or "known consignor" companies 
which have been security vetted by government appointed agents can send 
parcels by air, which do not have to be subjected to any further 
security checks.

"Unless a package from a known shipper arouses suspicion or is subject 
to a random search it is taken on trust that its contents are safe."

But:

"Captain Gary Boettcher, president of the US Coalition Of Airline Pilots 
Associations, says the 'known shipper' system 'is probably the weakest 
part of the cargo security today.'

"'There are approx 1.5 million known shippers in the US. There are 
thousands of freight forwarders. Anywhere down the line packages can be 
intercepted at these organisations,' he said.

"'Even reliable respectable organisations, you really don't know who is 
in the warehouse, who is tampering with packages, putting parcels 
together.'"

This system has already been exploited by drug smugglers:

"Mr Adeyemi brought pounds of cocaine into Britain unchecked by air 
cargo, transported from the US by the Federal Express courier company. 
He did not have to pay the postage.

"This was made possible because he managed to illegally buy the 
confidential Fed Ex account numbers of reputable and security cleared 
companies from a former employee.

"An accomplice in the US was able to put the account numbers on drugs 
parcels which, as they appeared to have been sent by known shippers, 
arrived unchecked at Stansted Airport.

"When police later contacted the companies whose accounts and security 
clearance had been so abused they discovered they had suspected nothing."

And it's not clear that a terrorist can't figure out which shipments are 
likely to be put on passenger aircraft:

"However several large companies such as FedEx and UPS offer clients the 
chance to follow the progress of their parcels online.

"This is a facility that Chris Yates, an expert on airline security for 
Jane's Transport, says could be exploited by terrorists.

"'From these you can get a fair indication when that package is in the 
air, if you are looking to get a package into New York from Heathrow at 
a given time of day.'"

And the BBC reports that 70% of cargo is shipped on passenger planes.  That 
seems like too high a number.

If we had infinite budget, of course we'd screen all air cargo.  But we 
don't, and it's a reasonable trade-off to ignore cargo planes and 
concentrate on passenger planes.  But there are some awfully big holes 
in this system.

http://news.bbc.co.uk/2/hi/americas/6059742.stm


** *** ***** ******* *********** *************

      Cheyenne Mountain Retired



Cheyenne Mountain was the United States' underground command post, 
designed to survive a direct hit from a nuclear warhead.  It's a Cold 
War relic -- built in the 1960s -- and retiring the site is probably a 
good idea.  But this paragraph gives me pause:

"Keating said the new control room, in contrast, could be damaged if a 
terrorist commandeered a jumbo jet and somehow knew exactly where to 
crash it. But 'how unlikely is that? We think very,' Keating said."

I agree that this is an unlikely terrorist target, but still.

http://apnews.myway.com//article/20061016/D8KPU1C02.html


** *** ***** ******* *********** *************

      Comments from Readers



There are hundreds of comments -- many of them interesting -- on these 
topics on my blog. Search for the story you want to comment on, and join 
in.

http://www.schneier.com/blog


** *** ***** ******* *********** *************

CRYPTO-GRAM is a free monthly newsletter providing summaries, analyses, 
insights, and commentaries on security: computer and otherwise.  You can 
subscribe, unsubscribe, or change your address on the Web at 
<http://www.schneier.com/crypto-gram.html>.  Back issues are also 
available at that URL.

Comments on CRYPTO-GRAM should be sent to schneier@xxxxxxxxxxxxxxxx 
Permission to print comments is assumed unless otherwise stated. 
Comments may be edited for length and clarity.

Please feel free to forward CRYPTO-GRAM, in whole or in part, to 
colleagues and friends who will find it valuable.  Permission is also 
granted to reprint CRYPTO-GRAM, as long as it is reprinted in its entirety.

CRYPTO-GRAM is written by Bruce Schneier.  Schneier is the author of the 
best sellers "Beyond Fear," "Secrets and Lies," and "Applied 
Cryptography," and an inventor of the Blowfish and Twofish algorithms. 
He is founder and CTO of Counterpane Internet Security Inc., and is a 
member of the Advisory Board of the Electronic Privacy Information 
Center (EPIC).  He is a frequent writer and lecturer on security topics. 
  See <http://www.schneier.com>.

Counterpane is the world's leading protector of networked information - 
the inventor of outsourced security monitoring and the foremost 
authority on effective mitigation of emerging IT threats. Counterpane 
protects networks for Fortune 1000 companies and governments world-wide. 
  See <http://www.counterpane.com>.

Crypto-Gram is a personal newsletter.  Opinions expressed are not 
necessarily those of Counterpane Internet Security, Inc.

Copyright (c) 2006 by Bruce Schneier.