Edward Snowden ripped the blinds off the surveillance state last summer with his leak of top-secret National Security Agency documents, forcing a national conversation about spying in the post-9/11 era. However, there's still no concrete proof that America's elite intelligence units are analyzing most Americans' computer and telephone activity — even though they can.
Los Angeles and Southern California police, by contrast, are expanding their use of surveillance technology such as intelligent video analytics, digital biometric identification and military-pedigree software for analyzing and predicting crime. Information on the identity and movements of millions of Southern California residents is being collected and tracked.
In fact, Los Angeles is emerging as a major laboratory for testing and scaling up new police surveillance technologies. The use of military-grade surveillance tools is migrating from places like Fallujah to neighborhoods including Watts and even low-crime areas of the San Fernando Valley, where surveillance cameras are proliferating like California poppies in spring.
The use of militarized surveillance technology appears to be spreading beyond its initial, mid-2000s applications in high-crime areas to target specific offenses such as auto theft. LAPD and the Los Angeles County Sheriff's Department are now monitoring the whereabouts of residents whether they have committed a crime or not. The biggest surveillance net is license plate reading technology that records your car's plate number as you pass police cruisers equipped with a rooftop camera, or as you drive past street locations where such cameras are mounted.
The Electronic Frontier Foundation and the American Civil Liberties Union of Southern California are suing LAPD and the Sheriff's Department, demanding to see a sample week's worth of that data in order to get some idea of what cops are storing in a vast and growing, regionally shared database. (See our story “License Plate Recognition Logs Our Lives Long Before We Sin,” June 21, 2012.)
Two dozen police agencies have gathered more than 160 million data points showing the exact whereabouts of L.A.-area drivers on given dates.
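For readers wondering what one of those data points might contain, here is a minimal, hypothetical sketch of a single plate read. The field names and values are illustrative assumptions, not details drawn from LAPD or Sheriff's Department records.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical schema for one automated license-plate-reader (ALPR) hit.
# The actual layout of the regional database is not public; this is a sketch.
@dataclass
class PlateRead:
    plate: str             # plate number read from the camera image
    captured_at: datetime  # when the read was made
    latitude: float        # where the cruiser or fixed camera was
    longitude: float
    camera_id: str         # which rooftop or street-mounted unit made the read

# One of the roughly 160 million records described above might look like this:
example = PlateRead(
    plate="7ABC123",
    captured_at=datetime(2013, 6, 21, 14, 37, 12),
    latitude=34.0522,
    longitude=-118.2437,
    camera_id="cruiser-0417",
)
print(example)
```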
Despite growing concerns among privacy-rights groups, LAPD hopes to greatly expand its mass surveillance: The city's traffic-camera system, 460 cameras installed above major roads and intersections by the Department of Transportation and now used to monitor traffic jams, could be folded into LAPD's surveillance network.
Los Angeles City Councilman Mitchell Englander wants to convert the system's video feeds to a digital format used by automated license-plate readers. Despite L.A. crime levels being at historic lows, Englander insists, “It is vital that the LAPD have instant split-feed access to the 460 traffic cameras … and be able to review this data to catch suspected criminals and protect our community.”
LAPD's mild-sounding “predictive policing” technique, introduced by former Chief William Bratton to anticipate where future crime would hit, is actually a sophisticated system developed not by cops but by the U.S. military, based on “insurgent” activity in Iraq and civilian casualty patterns in Afghanistan.
Records obtained by L.A. Weekly from the U.S. Army Research Office show that UCLA professors Jeff Brantingham and Andrea Bertozzi (anthropology and applied mathematics, respectively) in 2009 told the Army that their predictive techniques “will provide the Army with a plethora of new data-intensive predictive algorithms for dealing with insurgents and terrorists abroad.” In a later update to the Army, after they had begun working with LAPD, they wrote, “Terrorist and insurgent activities have a distinct parallel to urban crime.”
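Research in this vein has often modeled crime as a "self-exciting" process, in which each incident temporarily raises the expected rate of follow-on incidents nearby, much as an earthquake raises the odds of aftershocks. The sketch below illustrates only that general idea; it is not LAPD's deployed system, and every parameter value here is invented.

```python
import math

# Illustrative only: a self-exciting point-process intensity of the kind described
# in the academic predictive-policing literature. Not LAPD's actual software;
# mu, alpha and beta below are made-up numbers.
def crime_intensity(t, past_event_times, mu=0.5, alpha=0.3, beta=1.0):
    """Expected crime rate at time t (in days) for one small map cell.

    mu    -- baseline rate for the cell
    alpha -- how strongly each past incident boosts the near-term rate
    beta  -- how quickly that boost decays (per day)
    """
    boost = sum(alpha * beta * math.exp(-beta * (t - s))
                for s in past_event_times if s < t)
    return mu + boost

# Two burglaries earlier in the week raise today's predicted rate for this cell:
print(crime_intensity(t=7.0, past_event_times=[5.5, 6.2]))
```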
This month, LAPD sent a team to Israel, the Jewish Journal reports, to visit drone manufacturers and Nice Systems, a cyber-intelligence firm that can “intercept and instantly analyze video, audio and text-based communications.” Reporter Simone Wilson quoted Horace Frank, commander of LAPD's Information Technology Bureau, as telling an Israeli conference of data intelligence experts: “Let's be honest … We're here to steal some of your great ideas.”
Max Blumenthal, a journalist and fierce critic of Israel, tweeted: “LAPD delegation heads to Israel to learn lessons in control, domination and exclusion.”
With L.A. and most U.S. cities — including those that don't use predictive algorithms and license-plate recognition — enjoying a huge drop in violent crime, some are indeed questioning this liberal city's embrace of war and spy technology.
Ana Muniz, an activist and researcher who works with the Inglewood-based Youth Justice Coalition, says, “Any time that a society's military and domestic police force become more and more similar, where the lines have become blurred, it's not a good story.”
The military is supposed to “defend the territory from so-called external enemies,” Muniz says. “That's not the mission of the police force — they're not supposed to look at the population as an external enemy.”
L.A. residents may not yet grasp that more and more military technology is being aimed at them in the name of fighting routine crime. But Hamid Khan, an Open Society Foundations fellow who studies LAPD surveillance, warns, “Counterinsurgency principles are being incorporated on the local policing level.”
In 2010, LAPD announced a partnership with Motorola Solutions to monitor the Jordan Downs public housing project with surveillance cameras. Then-chief Bratton called it the start of an ambitious buildout to use remote “biometric identification,” which can track individuals citywide.
In a letter to the Los Angeles City Council, Bratton claimed that CCTV “will enable police to respond more effectively to criminal conduct,” and that “facial-recognition technologies will be used to enable law enforcement to more effectively enforce the gang injunction already in place.”
Then in January 2013, LAPD announced the deployment of more than a dozen live-monitored CCTV cameras in the Topanga and Foothill divisions in the San Fernando Valley. The cameras are equipped with facial-recognition software — purportedly programmed to ID people named on “hot lists” for having open warrants or because they were documented as active gang members.
Operated by LAPD's Tactical Technology Section, these cameras feed facial imagery to LAPD's Real-time Analysis and Critical Response Center (RACR), a digital “war room” that also creates up-to-the-minute crime mapping.
RACR opened in early 2012. The Weekly could not obtain a comment from LAPD on the efficacy of its live-monitored facial ID rollout, but the question remains whether all this is necessary.
Raytheon, which last year sold $13 billion in weapons to the Department of Defense, has a major contract with LAPD to outfit patrol vehicles with video cameras.
“Smart video” programs can use facial recognition to ID people by comparing live CCTV footage to mugshot databases built from facial scans collected by police using mobile devices or during bookings. Computer programs also can learn “acceptable behavior” by humans — such as pedestrian or vehicular traffic patterns — and alert cops when something “abnormal” occurs.
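At its core, the matching step such vendors describe amounts to comparing a numerical "signature" extracted from a live face against stored signatures from booking photos. The sketch below shows that comparison in the abstract; the similarity measure, the 0.8 threshold and the toy data are placeholder assumptions, not details of any LAPD or vendor system.

```python
import math

# Schematic of a facial-recognition matching step: compare a face "embedding"
# from live footage against stored embeddings on a hot list. The step that
# turns a face image into an embedding is omitted entirely.
def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def match_against_hot_list(live_embedding, hot_list, threshold=0.8):
    """Return the hot-list entry whose stored embedding best matches the live
    face, if the similarity clears the threshold; otherwise None."""
    best_id, best_score = None, threshold
    for person_id, stored_embedding in hot_list.items():
        score = cosine_similarity(live_embedding, stored_embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id

# Toy example with made-up three-number embeddings:
hot_list = {"warrant-4821": [0.1, 0.9, 0.3]}
print(match_against_hot_list([0.12, 0.88, 0.31], hot_list))  # -> "warrant-4821"
```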
LAPD already is using a sophisticated intelligence-analysis program from Silicon Valley firm Palantir, which is partially funded by In-Q-Tel, the Central Intelligence Agency's venture capital firm. Palantir sells data-mining and analysis software to the NSA and other intelligence agencies.
LAPD did not respond to requests for comment about the Palantir intelligence program. But Peter Bibring, a staff attorney with the ACLU of Southern California, has seen at least one Suspicious Activity Report from LAPD in which investigators used Palantir's intelligence-analysis software to delve into “license plates, leads and suspect profiles.”
(Suspicious Activity Reports, or SARs, are citizen- or police-generated tips about potential terrorist activities; they are controversial because they rely on the “reasonable indication” standard for investigating, rather than the tougher crime standard of “reasonable suspicion.”)
Whitney Richards-Calathes, a doctoral student at the City University of New York, who studies predictive policing and other tech-centric law enforcement trends, warns, “We have to be really critical about the built-in assumptions made when … databases are created for 'public safety,' yet youth as young as 9 and 10 years old are put in these secret databases, automatically labeled as gang members.”
Hamid Khan sees a situation developing in L.A. that might give some people pause: the spread of intelligence gathering by local police, using the lower standard of “reasonable indication” instead of “reasonable suspicion,” thus fueling “a culture of suspicion and fear” in many of L.A.'s nonwhite communities.