
Wealth Matters

Can Artificial Intelligence Keep Your Home Secure?

Ken Young, chief executive of Edgeworth Security, whose home security systems use technology like geofencing, facial recognition and A.I.-enabled cameras to help identify intruders. Credit: Ross Mantle for The New York Times

Home security is expected to be a $47.5 billion business by 2020. Top-of-the-line systems can include alarms, cameras, dogs, guards and even secret passageways. But even the most sophisticated systems can have a fundamental flaw: human error.

Now, security companies are hoping to harness the potential of artificial intelligence to better safeguard homes.

Experts say there are risks to using A.I., including concerns about privacy, the collection of personal data and racial bias, but security companies are promising better service at lower prices. Artificial intelligence, they say, can see more things faster than systems that rely on humans, who may not be paying attention.

“We put in the cameras to create a perimeter with no dead zones,” said Ken Young, chief executive of Edgeworth Security, a consulting firm in Pittsburgh that offers monitoring solutions.

To protect a property, these systems use technology like geofencing, facial recognition and A.I.-enabled cameras to help identify intruders. If someone crosses the geofenced boundary, the cameras alert a command center. If someone loiters too long at a call box at the entrance to an estate, the system sends an alert to the monitoring center, which responds with a tailored warning, like “You in the blue shirt, please leave.”
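
What such a check might look like can be sketched in a few lines of code. The snippet below is a rough illustration only, not Edgeworth’s software: the polygon test, the 60-second loitering threshold and all of the names are assumptions made for the example.

    # Illustrative sketch of a geofence and loitering check; the threshold
    # and names are invented for this example, not taken from any vendor.
    import time

    LOITER_SECONDS = 60  # assumed cutoff before the command center is alerted

    def inside_geofence(point, fence):
        """Ray-casting test: is the (x, y) point inside the polygon 'fence'?"""
        x, y = point
        inside = False
        for i in range(len(fence)):
            x1, y1 = fence[i]
            x2, y2 = fence[(i + 1) % len(fence)]
            if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
        return inside

    def check_track(track, fence, now=None):
        """Return an alert if a tracked person crosses the fence or loiters."""
        now = now or time.time()
        if inside_geofence(track["position"], fence):
            return "ALERT: perimeter breached"
        if now - track["first_seen"] > LOITER_SECONDS:
            return "ALERT: loitering at the call box"
        return None

    # Example: someone who has waited outside the gate for two minutes
    fence = [(0, 0), (100, 0), (100, 100), (0, 100)]
    visitor = {"position": (-5, 50), "first_seen": time.time() - 120}
    print(check_track(visitor, fence))  # -> ALERT: loitering at the call box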

Mr. Young said the system uses artificial intelligence to tell the difference between movement into and out of a property, and it also uses facial recognition technology to distinguish regular visitors — like gardeners or delivery people — from strangers.
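
The “regular visitor” idea can likewise be sketched as a comparison of face embeddings against a stored list of known people. Again, this is an assumed, simplified illustration; the placeholder embeddings, the 0.6 distance threshold and the function names are not drawn from any vendor’s system.

    # Hypothetical whitelist check: label a face as a known visitor or a stranger.
    import numpy as np

    KNOWN_VISITORS = {
        "gardener": np.array([0.12, 0.80, 0.33]),   # placeholder embeddings
        "delivery": np.array([0.45, 0.10, 0.77]),
    }
    MATCH_DISTANCE = 0.6  # assumed threshold; real systems tune this per model

    def classify_face(embedding):
        """Return the closest known visitor, or flag the face as a stranger."""
        best_name, best_dist = None, float("inf")
        for name, known in KNOWN_VISITORS.items():
            dist = np.linalg.norm(embedding - known)
            if dist < best_dist:
                best_name, best_dist = name, dist
        if best_dist <= MATCH_DISTANCE:
            return best_name
        return "stranger: notify command center"

    print(classify_face(np.array([0.13, 0.79, 0.35])))  # -> gardener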

“When I worked at the White House, the grounds were gridded out with cables,” said Mr. Young, who was part of the Marine One security detail and served as an emergency action planner to the executive branch during President George W. Bush’s administration. “Now, it’s all done through the lens of the camera.”

Companies like Galaxy Security also make enhanced video cameras similar to the ones Edgeworth uses, and other security companies offer enhanced video surveillance as an add-on to existing camera systems.

The systems that Edgeworth installs can start around $20,000 for eight cameras on a small property and rise to more than $600,000 for large estates. Monitoring costs $8 to $12 an hour, and homeowners can choose when they want the monitoring turned on.

That level of security is a draw for wealthy homeowners and property owners.

The actor Joe Manganiello realized the weakness of his home security system a few years ago. He was at home in Beverly Hills, Calif., with his wife, the actress Sofia Vergara, when he heard someone walking around their property.

Ms. Vergara checked the security cameras and noticed they were blacked out. Two men on their property had been spray-painting the lenses for nearly 45 minutes, which the company monitoring the security feed had missed.

Edgeworth Security’s command center in Pittsburgh. The company offers monitoring solutions using artificial intelligence. Credit: Ross Mantle for The New York Times

“These guys were trying to crowbar in the kitchen window; then they moved to the living room door,” said Mr. Manganiello, who is known for his roles on “True Blood” and “Magic Mike.” “I was standing at the top of the stairs with a weapon.”

When the men broke through the front door, the security alarm sounded and they ran off, he said. But the attempted break-in made him realize it was time for a security upgrade.

Many multimillion-dollar homes are ill equipped from a security perspective, professionals say. According to a 2011 study by the Justice Department, 94 to 98 percent of burglar alarm calls were false alarms, making the systems unreliable.

Tom Gallagher, president of DSL Construction, which owns 26 residential buildings with more than 1,400 apartments in Los Angeles, said he wanted to change how the properties were protected.

“Over the years, it just became increasingly clear to me that the quality of the guards and the guard services were horrible,” he said. “They weren’t very effective.”

At first, he tried to create his own guard company, but that was too expensive, so he began researching enhanced security systems. He said installing the systems in all of the company’s properties would save $400,000 to $500,000 a year and would also be more reliable.

“We had cameras out there when we still had guards,” Mr. Gallagher said of his trial phase. “We had an incident that the cameras picked up. Where was the guard? He was sleeping in his car for six hours.”

Thomas Tull, the chief executive of Tulco, which owns Edgeworth, said what he wanted for himself and his clients was a system that anticipated risks, not just responded to them.

He gave as an example a worker in one client’s home who posted a picture of the house online; the Edgeworth security system flagged the photo within a minute, and it was taken down. In another instance, the plans for someone’s compound were detected on the so-called dark web.

“Who knows what they were going to do with it?” Mr. Tull said. “That’s a problem that didn’t exist 20 or 25 years ago, this digital extension of yourself.”

How these systems learn the difference between good behavior and bad is a fraught ethical question.

“There is inherent bias in the computational systems,” said Illah R. Nourbakhsh, the K&L Gates professor of ethics and computational technologies at Carnegie Mellon University’s Create Lab.

A recent study at the M.I.T. Media Lab showed how biases in the real world could seep into artificial intelligence. Commercial software is nearly flawless at telling the gender of white men, researchers found, but not so for darker-skinned women.

Sofia Vergara and Joe Manganiello upgraded their home security system with artificial intelligence after a break-in at their home in Beverly Hills, Calif. Credit: Chris Pizzello/Invision, via Associated Press

And Google had to apologize in 2015 after its image-recognition photo app mistakenly labeled photos of black people as “gorillas.”

Professor Nourbakhsh said that A.I.-enhanced security systems could struggle to determine whether a nonwhite person was arriving as a guest, a worker or an intruder.

One way to guard against the system’s bias is to make sure humans are still verifying the images before responding.

“When you take the human out of the loop, you lose the empathetic component,” Professor Nourbakhsh said. “If you keep humans in the loop and use these systems, you get the best of all worlds.”

Security consultants recommend a layered approach that could include artificial intelligence.

Michael A. Silva, principal of Silva Consultants in Seattle, said people needed to do a risk assessment first. Some very wealthy people are relatively unknown, so their risk is low, he said, but a less wealthy person with controversial opinions could be a more prominent target.

Mr. Silva said any security plan started with the basics — good locks, strong doors, an alarm system — and could be expanded to full perimeter screening with either monitoring enhanced with artificial intelligence or more traditional motion detectors and alarms. Celebrities and other well-known people may want to build a safe room in their homes, he said, or have their own command centers.

“Before you start prescribing medicine, you need to diagnose the condition,” Mr. Silva said. “A risk assessment is really crucial.”

Christopher Falkenberg, a former Secret Service agent and the president of Insite Risk Management, said that with threats being made so easily over social media, he needed to help clients control their personal information and who had access to it.

He said his firm used existing technology and had created some of its own programs to track what was being said about clients online.

“We used to be concerned with a small circle of people with information about you — the gardeners, the people who were on the property,” Mr. Falkenberg said. “We can’t vet all the people online the way we used to vet the gardener. We have to talk to clients about controlling the information that they personally put out there.”

At a minimum, what any security program hopes to do is make a home less attractive to criminals.

“We’ll never reduce the crime rate in East Hampton or Greenwich,” Mr. Falkenberg said. But, he added, “if we can make it that much more difficult to target our people, we’ll have achieved our goal.”

A few months ago, Mr. Manganiello and Ms. Vergara’s home was targeted again. But this time, their new system from Edgeworth with geofencing technology and A.I.-enabled cameras detected three men before they could get close to the house.

“As they were trying to figure out where to come in, the command center was guiding the police to our house,” Mr. Manganiello said. “They were able to apprehend them and their getaway driver before they could even touch the house.”
