Why the future of AI is human-centric software
Ask anyone about their favourite work software, and you'll likely hear a litany of complaints. From new timesheet systems causing business interruptions to the chaos brought about by the Post Office Horizon debacle, business software seems to garner more disdain than admiration.

The root of the problem lies in the top-down approach taken when these systems are designed, orchestrated by people with minimal concern for those on the front line. Corporate executives have been lured by the promise of total control, inadvertently turning their organisations into IT departments with the business as a mere afterthought.

A few years back, we implemented a new timesheet system, a cutting-edge solution from one of the industry giants. It pledged cost savings, financial control, and a plethora of impressive dashboards. However, the hitch was its complexity, rendering it practically unusable. The IT department dispatched high-priced consultants donning vibrant T-shirts, but their daytime visits conveniently overlooked our nighttime frontline staff. No timesheet submission meant no pay.

Technology leaves many people uncomfortable, and those in the construction industry certainly didn't sign up to spend their days tethered to a computer. Consequently, frontline staff often find themselves daunted and apprehensive due to needlessly intricate and poorly designed systems. Instead of acknowledging failure, global software companies typically deflect responsibility, compelling us to devise our own workarounds.

The Post Office scandal epitomises the nadir of corporate IT failures. The scapegoating of frontline staff and the shielding of those accountable for a flawed system resulted in terminations, bankruptcies, and tragic losses. Software can and does fail, which underscores the importance of governance and systems to ensure compliance. More crucial, however, is the need to listen to, empower, and support those on the front line who drive your business, allowing them to operate at their full potential.

Today, the advent of AI heralds a new era of business efficiency and insights. However, with numerous consultancies and tech companies peddling AI dreams to C-suite executives, we risk repeating the Post Office disaster. Many purportedly "AI-powered" tech firms struggle to fulfil their promises, often resorting to teams of low-paid offshore staff manually executing tasks under the guise of real-time AI. Unsubstantiated AI-driven insights based on subpar data and poorly written code can lead to erroneous decision-making, resulting in unfair dismissals, financial losses, or worse. This creates a widening trust gap between frontline and office staff, burdening employees with unnecessary and time-consuming administrative tasks to uphold unattainable software promises in oversight, insights, and foresight.

The Post Office scandal should serve as a wake-up call, urging us to hold software companies accountable for their failures instead of perpetuating protection.

AI has the potential to enhance efficiency, speed, and overall performance. At SymTerra, we've successfully integrated AI into various internal tasks over the past two years and explored its applications for our clients. What we're implementing this year may seem modest compared to the grandiose promises made by other tech companies, but we're committed to delivering achievable outcomes rather than making unrealistic commitments to clients.

Companies must recognise the value of their frontline staff and prioritise improving their work lives. Simultaneously, they need to assert control over their data and foster a culture of trust and understanding within the organisation regarding sound data practices and governance. Only then will a business be well-positioned to harness the power of AI; otherwise, it risks damaging its culture, alienating crucial staff, and ultimately only enriching the coffers of AI consultants!
