I feel like I’ve written this column before, but man, it needs to be written again. And again. And again. One day, the message is going to make it through…I hope.
Here’s the message, and it might be painful: You aren’t good enough to use AI to write things. You just aren’t.
You know what stands out these days? Human grammatical errors. Un-whitewashed, authentic writing. Sentences that don’t include “it’s not just X, it’s also about Y.” Paragraphs that don’t begin with “In a security environment where X and Y happen…” Groups of sentences that don’t all say the exact same thing three different ways.
Two words.
One sentence.
Many hard returns.
Think about it this way: You drive a car every day. Could you drive NASCAR? Odds are, if you tried, your car would end up crashed into a wall.
When it comes to writing, people are crashing their cars into the wall hundreds of times a day. Frankly, from where I sit, it is exhausting.
In the hands of a skilled writer and editor, AI can be an essential tool for writing enhancement. In the vast majority of people’s hands, it’s a tool to make everything generic, repetitive, and a slog.
Here’s the thing: I’m not just talking about writing. Look at what’s happening across the security industry right now. Every manufacturer has AI stamped on something. AI-powered this and AI-enhanced that. It’s everywhere, and integrators are deploying it at speed.
But how many of them actually understand what’s happening under the hood? How many are configuring these systems properly, training the models on the right environments, tuning detection thresholds so they actually mean something? Or are they just flipping the switch and assuming the technology handles the rest?
Anyone can deploy AI tools for security, but the car crashes into the wall when you end up with a security system that generates 200 false alarms a day.
And just like horrendous AI-generated writing, nobody’s fooled. The end-users aren’t fooled; security directors reviewing system performance aren’t fooled. The only people who seem fooled are the ones who sold it, deployed it, and then walked away.
In the hands of a skilled integrator – one who takes the time to understand technology, configure it for the specific environment, and refine it over time – AI is a legitimate force multiplier. In the hands of someone who treats it as a magic box, it’s just expensive noise.
The parallel is exact: AI amplifies what you bring to it. Bring expertise, and you get something powerful. Bring nothing, and you get polished garbage – whether it comes in the form of a press release or a video system that can’t tell a person from a plastic bag.
Stop fooling yourselves.
About the Author
Paul Rothman
Editor-in-Chief/Security Business
Paul Rothman is Editor-in-Chief of Security Business magazine. Email him your comments and questions at [email protected]. Access the current issue, full archives and apply for a free subscription at www.securitybusinessmag.com.

