When I inherited the vendor risk assessment process at Optiv, the instinct — the one I had to consciously override — was to automate what existed.
It would have been the faster path. Map the current steps, identify which ones Power Automate could handle, build the flows, ship it. Done in six weeks instead of twelve.
It also would have been wrong.
What was actually there
The process had accumulated workarounds the way old software accumulates technical debt: invisibly, gradually, one reasonable decision at a time.
Someone couldn’t get data out of System A in the right format, so they exported to Excel and reformatted manually. Someone else needed a field that didn’t exist in the form, so they started putting it in the notes column. An approval step required two sign-offs but the workflow only tracked one, so the second lived in someone’s email inbox. Each workaround made sense in the moment. Together they created a process that took three months and was nobody’s fault.
If you automate that, you get a faster version of the same dysfunction. The Excel reformatting happens in a script instead of manually. The notes column gets parsed with fragile regex. The email approval step breaks the automation every time someone is on vacation.
You’ve digitized the workaround. You haven’t fixed the process.
How to tell the difference
The signal I’ve learned to look for: Is this step here because it creates value, or because something upstream failed?
Every workaround is a symptom of an upstream failure. The Excel reformatting exists because the data structure was wrong at the point of collection. The notes column exists because the form wasn’t designed with all the required fields. The email approval exists because the system doesn’t support the actual approval logic.
Before you automate, trace each step back to its origin. If the origin is “we couldn’t do it the right way,” that’s the thing to fix first.
What the redesign conversation looks like
At Optiv, this meant going upstream to the SharePoint form — the very first step in the process — and redesigning it to collect data in the format the downstream steps actually needed. It meant adding the missing fields. It meant building validation at the point of entry so bad data couldn’t propagate.
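The idea of validating at the point of entry can be sketched in a few lines. The field names and allowed values below are invented for illustration; the real implementation was SharePoint form validation, not Python, but the logic is the same: reject a bad submission before it enters the workflow, rather than repairing it downstream.

```python
# Hypothetical sketch of point-of-entry validation. Field names and
# allowed values are assumptions for illustration only.
REQUIRED_FIELDS = {"vendor_name", "data_classification", "contract_owner"}
ALLOWED_CLASSIFICATIONS = {"Public", "Internal", "Confidential", "Restricted"}

def validate_submission(submission: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the entry is clean."""
    errors = []
    # Missing fields are caught here, not discovered months later in a notes column.
    for field in sorted(REQUIRED_FIELDS - submission.keys()):
        errors.append(f"missing required field: {field}")
    # Free-text values are constrained to a known vocabulary at entry.
    classification = submission.get("data_classification")
    if classification is not None and classification not in ALLOWED_CLASSIFICATIONS:
        errors.append(f"unknown data_classification: {classification!r}")
    return errors

# A clean entry passes with no errors.
print(validate_submission({
    "vendor_name": "Acme",
    "data_classification": "Internal",
    "contract_owner": "j.doe",
}))  # []
# A bad entry is stopped at the door: missing owner, unrecognized classification.
print(validate_submission({"vendor_name": "Acme", "data_classification": "internal"}))
```

The design choice is the important part: every error this function catches is one the downstream automation never has to work around.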
This took longer than automating what existed. It required conversations with stakeholders who had strong opinions about the form. It required mapping dependencies I hadn’t initially known about.
But when we built the Power Automate flows, they worked. Not just in testing — in production, with real data, under normal operating conditions. Because the process they were automating was actually designed to be automated.
The 92% reduction in processing time came from the redesign as much as the automation. You can’t separate them.
The practical question
When someone comes to me with an automation request, the first thing I ask is: Walk me through the current process, including the things people do outside the system.
The outside-the-system parts are almost always where the workarounds live. That’s the territory you need to understand before you write a single line of automation logic.
The automation conversation is always worth having. The redesign conversation comes first.