
One of my favorite songs is "Destroyer" by The Kinks. It's one of the first songs I remember from my early radio days, and it fit my mindset at the time.

There's a red, under my bed
And there's a little green man in my head
And he said, 'you're not goin' crazy, you're just a bit sad
'Cause there's a man in ya, gnawin' ya, tearin' ya into two'

Silly boy ya' self-destroyer
Paranoia, the destroyer

Every second or third project that I'm on, I hear that song in my head. It starts when a client is describing a process that includes decades of checks and cross-checks that have been added over time. Each requirement probably has an interesting story behind it, but the stories are lost.

Why red ink? Why above and to the right? Why is a second copy of the report printed 24 hours after the first copy? Why do you password protect your documents, have the CMS encrypt them and have the storage device encrypt the file? Why must the request be buried in soft peat for three months and recycled as firelighters?

While I may not have asked that final question on a project, my career isn’t over yet. One of those questions ran through my head while talking to a US Federal client. It wasn’t the NSA, but as Edward Snowden proved, no amount of paranoia can stop someone who is inside your systems.

Security is important. Ask any employee of Sony Pictures if you have any doubts. Paranoia is risk management taken to an excessive level. Paranoia has its place in building requirements and making design decisions, but paranoia needs to be tempered by reality.

Diminishing Returns

I recently worked on mapping a process that handled payments. The actions were tracked in a homegrown system. Each day, the system spit out a report detailing the payments for the day. At the end of the week, someone would collect all the daily reports and check them against a weekly report produced from the same system. Once validated, printed, signed, and scanned, the reports were sent to accounting along with a second run of the weekly report, a feed for ADP, and a spreadsheet used to match ADP's confirmation report against the weekly report.

All of these validation systems and spreadsheets run macros and code written and maintained by IT. They were created to make sure that IT doesn’t commit fraud. Let me make this clear: Accounting is using tools created by IT to make sure that nobody in IT can cheat the system and redirect a payment.

Forget that the person waiting for payment would likely complain if they didn't receive it within two to three weeks. Forget that if IT wanted to commit fraud, they would just have to update the underlying database before a daily report was generated. Forget that people are manually checking hundreds of payments every day and potentially thousands per week. Forget that the payments are typically less than $100.

Remember that last detail. A huge process is in place, designed to prevent fraud that would be caught in short order, cost someone their career, and likely result in criminal charges. All in case someone in IT decides to risk it all for $100.

Sure. Why not?

There should be checks in place to make sure that everyone who has earned a payment gets a payment. The totals should match. Controls should be in place to limit access to the data. But adding 20 hours of labor per week to prevent $100 of fraud is excessive.

You Can’t Design Perfection

Mistakes will happen in any process that includes human involvement. Sure, there need to be checks in place. Yes, you want to limit fraud as much as possible.

The answer isn’t five levels of validation. The answer lies in automation. The answer lies in watching for strange user behaviors. The answer lies in removing humans from the data entry and having information originate digitally. The answer lies in validating that the total amount of payments matches what was earned in total. The answer lies in making sure that nobody takes security for granted.
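To make the totals check concrete, here is a minimal sketch of what an automated reconciliation might look like. The record format, field names, and amounts are hypothetical illustrations, not the client's actual system; the point is that a few lines of code can replace a weekly pile of printed reports.

```python
# Minimal sketch of an automated totals check: confirm that the sum of
# individual payments matches the total earned. Amounts are kept in
# integer cents to avoid floating-point drift in money math.
# The record format here is a hypothetical illustration.

def reconcile(payments, expected_total_cents):
    """Return (ok, difference_in_cents) for a batch of payments."""
    paid = sum(p["amount_cents"] for p in payments)
    return paid == expected_total_cents, paid - expected_total_cents

batch = [
    {"payee": "A", "amount_cents": 4250},  # $42.50
    {"payee": "B", "amount_cents": 9999},  # $99.99
]
ok, diff = reconcile(batch, expected_total_cents=14249)
# ok is True and diff is 0; any mismatch surfaces immediately,
# with no human eyeballing hundreds of line items.
```

A check like this runs on every batch, every day, for free, and flags the difference the moment it appears instead of at the end of the week.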

When there is an issue, the natural answer is to ask what extra controls could have prevented the problem from occurring. The other question that needs to be asked is how technology can be used to detect problems before they're actually problems.

Funnel paranoia to detect issues. Don’t add so many checks and balances that people can no longer do their jobs.

Title image by schnappischnap, Creative Commons Attribution 2.0 Generic License.