Humanizing technology: My mission & your responsibility to humanity

Machines are taking over the world as we know it, and I do not feel fine.

Now let’s be clear: this isn’t a Terminator situation out of a James Cameron film. But we are entering a precarious era in human evolution. This new era, which some call the Fourth Industrial Revolution, marks a dramatic and exponential shift in how technology influences the way people live, work, and experience their worlds.

As leaders and decision-makers, it’s our responsibility to look for answers beyond business outcomes, project velocity, and innovation for innovation’s sake. Far too often, we place too high a premium on the theoretical utility of technology and completely fail to ask questions about its impact on the end users (the humans).

So, whose side are you on: humans or machines?

Playing for the wrong team

From a young age, I knew what I wanted to do: raise people up, empower businesses to take on the next big challenge—and make technology a catalyst for human capability. But when it came time to apply that philosophy to running my own company, I could not have been further removed from those aspirations—I was playing for the wrong team. Here are a few examples.

Always on the clock (a story of mistrust)

When the Softway India side of my company moved to its first large office in Bangalore, we used biometric scanners to regulate and track access in and out of the office. I wanted my people to be safe, and I wanted to protect company property and resources. By this logic, having biometric scanners was a good idea.

But here’s where I took things too far. As people scanned into the office for work in the morning, the system recorded their arrival times; it also kept a record as they went outside for smoke breaks, tea breaks, walks, and a little fresh air. At the end of the month, I tallied up the time lost to tardiness and tea breaks, and I docked their pay accordingly.

Reading this, you can imagine how uncomfortable I made my employees. But at the time, I couldn’t imagine that would be the case. It seemed to make logical sense, and many other tech firms in India enforced the same policies. At first, it even seemed to be working: tardiness went down. But productivity remained unchanged.

My employees were shocked by the clear mistrust the scanners represented, and it sapped their motivation and creativity. The tech completely failed to produce the desired outcomes; in fact, it made things worse. So I removed the scanners, gave up on the practice and the mindset behind it, and saw an increase not just in productivity but in the creative relationships among my employees.

Vacation is relaxing—but not getting paid hurts

To streamline and improve resource tracking, and make requesting paid time off easier for my India team, I introduced software that would run the entire vacation/leave process.

We launched the software. At first, I didn’t hear any complaints or issues, but then I started getting feedback from people who had gone on vacation: they hadn’t been paid for the time they were out of the office. Apparently, my employees didn’t know they were supposed to use the new software, or that it tied payroll to our biometric system. Because employees on vacation hadn’t scanned into the office, the system assumed they were no-shows. I was furious and confused. Then I realized that I had failed to implement any sort of communication strategy, rollout plan, or contingency measures for when users made a mistake.

I hadn’t set the tech (or my people) up for success. But how? Why? The tech looked perfect on paper. Newsflash: people are NOT perfect. They make mistakes, and tech needs to be developed with this in mind. Technology needs to be forgiving.

My “perfect” vacation software and lukewarm rollout plan were not empathetic, and they were not forgiving; instead, they were a gut-punch. Because Softway India is on a 30-day pay cycle, reimbursing employees for the missed pay took another 30 days. They had to scramble to pay bills and lean on emergency credit cards; some even had their utilities turned off. It was excruciating, and it forever changed how I thought about technology and, more importantly, people.

Stories like these illustrate the bias I held toward technology, especially new technology. I believed you could take whatever the brochures and pamphlets said at face value, hand out the technology, and it would be seamlessly picked up and adopted because it was more “efficient” and “advanced.” But people don’t adopt a tool just because it promises efficiency. You have to consider how it fits into their current environment, and if you’re asking them to change, you have to advocate for that change. That mindset was my failing; recognizing it raised alarms, evolved my understanding of the human condition, and became my learning.

The human condition

When thinking about new tech, you need to look comprehensively at the environment it will enter, including the human landscape. Just look at the stories I’ve already mentioned. Biometric scanners are meant to boost security and safety, but I used them to police accountability and productivity. By not using the tech for its intended purpose, I created an environment of fear and mistrust. I wanted to use software to make taking vacation easier for my employees; instead, I stirred up a tremendous amount of stress and uncertainty by not empathizing with my people or considering how forgiving the technology needed to be.

This is the human condition: the behaviors, ambitions, expectations, emotions, wants, and needs that are essential to human satisfaction and success. It was the part of the equation my business was missing, and something I had to learn the hard way. As leaders and decision-makers, we need to be more empathetic to the human condition, and in turn our technology will be more forgiving of it.

Human Experience > User Experience (UX)

Think about where our lives were at the beginning of 2020. How did you use technology differently? Do you remember what it felt like to share a space with team members? What did work-life balance look like for you? Answering those questions in today’s terms looks starkly different, from working remotely and social distancing to managing projects, home-schooling, and eating dinner all at once. The world has shifted, and the human condition has shifted along with it. Which raises the big question: is your tech adapting to suit these new human needs?

At Softway, we operate with extreme empathy for the humanity around technology: when you start from Love, better solutions follow.

Humans play

Physical therapy is a process, a sometimes long and challenging one that can feel discouraging for patients who don’t see results quickly. A physical therapist and their team recognized this struggle and wanted to work with Softway to find a way to keep patients engaged and motivated in their recovery.

We looked at patients as humans, and asked, “What do humans enjoy?” The answer: Play. To help patients with their at-home exercises, we gamified the process with a piece of wearable tech and an easy-to-use mobile app. The app guided patients through exercises while evoking a sense of play and fun. It even helped visualize and track their progress, leading to better patient outcomes.

Humans are on the move

When people who rely on electric wheelchairs experience technical issues, it isn’t just their mobility that is impeded, but their ability to experience the world. For one electric wheelchair manufacturer, it took an average of two days to perform the necessary service when customers had issues with their chairs. What’s more surprising is that 80% of the time, the error was caused by a problem in the chair’s settings.

For those whose wheelchair is their connection to the world, waiting two days is unacceptable, especially when the issue is a quick, simple fix. To narrow the gap between diagnosis and repair, we built an app that communicates with wheelchairs remotely, allowing the people maintaining the tech to identify and resolve issues in minutes, not days. Our solution didn’t just make the product more reliable; it made the customer service experience more reliable and more human.

Radiologists read CT scans more carefully and feel a stronger sense of empathy when they see a picture of a patient’s face. Chefs cook better food when they can see and empathize with the people they are cooking for. Farmers who empathize with the people they grow food for view their jobs as sacred and produce more crops. And technologists create better, more humanized tech when they do so with Love and empathy for the fluidity of the human condition.

Power to the people

Public perception of technology is changing. In addition to expecting cleverer UX, breakneck processing speeds, and snappier algorithms, people are demanding a greater level of humanity and ethics in the tech they use. And that responsibility falls on the companies that make it.

Amazon went from selling books to selling books and just about anything else you can imagine. In 2016 alone, Amazon had over 12 million different products in its own inventory; counting third-party marketplace sellers, that number rose to over 350 million. As those numbers grew, so did the number of counterfeit products, creating a lot of mistrust and frustration among customers and sellers. Many felt it was simply something they had to deal with, that it was the nature of the beast, and that “tech knows best”.

Amazon recognized that the fallout from counterfeit products was on their shoulders—not because they lost money, but because “buyer beware” is not kind to the human condition. In March of 2017, Amazon launched Transparency by Amazon, a service aimed at combating the sale of counterfeit goods by bolstering transparency across the entire buyer and seller journey.

For a modest fee, Transparency by Amazon provides sellers with unique barcodes that allow Amazon to track and authenticate products before they reach customers. Once a product is delivered, customers can scan its barcode and see the manufacturing date, the manufacturing location, and additional product information such as ingredients. Aside from protecting company profits and brand image, Transparency by Amazon is a uniquely human tech solution. Transparency begets trust, loyalty, and a sense of security; Amazon empathized with this and built technology that works for humans, not against them.

The paradigm is shifting, and society as a whole is holding developers responsible for the accidental negative impacts of their technology, whether that technology is nascent or mature. If you want to be on the right side of this shift, you need to ask not only, “What can this technology do to help people?” but also, “How could this technology potentially harm people?”

Technology fit for people and purpose

Technology is a mirror we hold up to humanity. As leaders and decision-makers, we have an obligation to create technology that holistically reflects the human condition. We must push forgiveness, empathy, and Love to the forefront, and remember whose side we’re on: humanity’s.