Written By Michael Ferrara
Created on 2025-04-03 10:44
Published on 2025-04-03 12:25
I’ve spent enough time around tech to see both sides of the story: the innovation we love to celebrate — and the recklessness we tend to ignore.
Behind the metrics and mission statements, I see something more human: the anxiety of getting laid off after years of loyalty. The anger of watching your data sold off behind your back. The exhaustion of trying to “beat the algorithm” just to get your resume seen. The quiet resignation when founders are applauded for behavior that would get anyone else fired.
People in tech — and affected by tech — are burned out, disillusioned, and increasingly skeptical of the system. And honestly? They have every right to be.
For every bold founder or breakthrough product, there’s a pattern of behavior that makes me pause. Some of it is cultural. Some systemic. But all of it is avoidable.
Here are five things I keep seeing that scream “irresponsible” — and why I think we should stop normalizing them.
Collecting user data without clear, informed consent isn’t just unethical — it’s dangerous. From apps that bury permissions in legalese to smart devices that quietly harvest behavioral patterns, the “ask for forgiveness, not permission” mindset turns privacy into collateral damage.
Many companies justify this by claiming it helps "personalize the experience" — but what they’re really doing is building digital profiles to predict, manipulate, and monetize our behavior. The worst offenders don’t even pretend to ask. They operate on the assumption that silence equals consent.
💬 “If you're not paying for the product, you are the product.”
That line is usually credited to Andrew Lewis, a British web developer, who posted it in a 2010 MetaFilter thread under the pseudonym blue_beetle. But it was Tristan Harris — Google’s former design ethicist and Co-Founder of the Center for Humane Technology — who helped catapult the phrase into the mainstream. Through his talks, interviews, and the documentary The Social Dilemma, Harris turned the warning into a cultural wake-up call.
We’re now living in the world they both warned us about: surveillance capitalism as a business model. And the longer we let it go unchecked, the more normalized digital overreach becomes.
We get it — speed is everything. But when companies push out software riddled with bugs, security flaws, and broken functionality just to meet an arbitrary internal deadline, users pay the price. And it’s not just annoying — it’s sometimes dangerous.
Take Zoom’s early security flaws, which exposed private meetings to "Zoombombing" during the pandemic. Or Robinhood's app outage in 2020, which locked users out of trades during one of the most volatile market days — costing some people thousands. Even Windows 10 updates have occasionally broken printers, network drivers, or entire desktops because patches were rushed and poorly tested.
"Move fast and break things" might’ve sounded edgy at first, but today it reads more like a confession. Minimum Viable Products should be just that — viable, not barely breathing.
You don’t gain user trust by turning them into unpaid QA testers. Especially when the stakes include their money, privacy, or ability to work.
In the tech world, companies often flaunt headcount like a status symbol — until suddenly, thousands are laid off to “correct course” right after celebrating record-breaking quarters. The justification? Efficiency, restructuring, “right-sizing.” But when job cuts follow growth, it’s not about survival. It’s about optics.
Worse still is the rise of the ghost workforce — temps, contractors, and offshore workers who operate in the shadows of full-time employees. These workers are brought on in droves, often through staffing agencies, with none of the protections, benefits, or stability afforded to staff. They are essential to day-to-day operations but invisible when it comes to company reports, press releases, and DEI dashboards.
Companies scale up using armies of contingent labor, then quietly cut them loose at will. No severance. No PR announcement. No headline.
💬 “You know you're being used when you're indispensable but not included.”
This practice creates a two-tiered system that maximizes flexibility for companies while keeping real human beings in a permanent state of precarity. It’s not just irresponsible — it’s exploitative. And it's time we stopped pretending it's normal.
If there’s one thing that triggers frustration across LinkedIn, it’s this: submitting dozens of job applications only to be ghosted — not by people, but by algorithms.
AI is everywhere now, especially in hiring. Applicant Tracking Systems (ATS) filter resumes based on keywords, auto-reject candidates before a human ever sees the name, and reduce people to Boolean strings. It’s sold as efficiency, but it often feels like exclusion at scale.
🙅‍♂️ Didn’t use the exact phrasing from the job description? Rejected.
📄 Took a career break? Rejected.
🧠 Career changer with transferable skills? Rejected.
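To see why exact-phrasing mismatches are so punishing, here is a deliberately naive keyword screen in the spirit of basic ATS filtering. The phrases, threshold, and scoring are invented for illustration — no real vendor's logic is being reproduced.

```python
# A naive keyword filter, similar in spirit to a basic ATS screen.
# All phrases and thresholds are illustrative, not any vendor's logic.
REQUIRED_PHRASES = ["stakeholder management", "python", "agile"]

def naive_ats_score(resume_text: str) -> float:
    """Fraction of required phrases found verbatim in the resume."""
    text = resume_text.lower()
    hits = sum(1 for phrase in REQUIRED_PHRASES if phrase in text)
    return hits / len(REQUIRED_PHRASES)

def passes_screen(resume_text: str, threshold: float = 0.67) -> bool:
    return naive_ats_score(resume_text) >= threshold

# A strong candidate who wrote "managed stakeholders" instead of the
# exact phrase "stakeholder management" loses that keyword entirely,
# scores 2/3, and falls just under the cutoff.
resume = "Led agile teams; managed stakeholders; 8 years of Python."
print(passes_screen(resume))  # prints False
```

The point isn't that any one filter is this crude — it's that verbatim matching, at any level of sophistication, rewards mimicry of the job description over actual ability.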
We’re told to “optimize for the machine,” but what about optimizing for human potential?
These systems aren’t neutral. They reflect the biases of the data they were trained on — and the assumptions of the people who built them. And when companies rely on AI to make decisions in hiring, policing, healthcare, and finance without oversight, they’re not eliminating bias — they’re institutionalizing it.
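How does a model "institutionalize" bias? A toy sketch makes it concrete: train the simplest possible screen on invented historical hiring data, and it faithfully reproduces whatever pattern — fair or not — the past decisions contained. Everything here (the schools, the outcomes) is fabricated for illustration.

```python
# Toy illustration of institutionalized bias: a "model" that memorizes
# the majority outcome per school from historical hiring decisions.
# The data is invented; the point is that the screen can only repeat
# the pattern it was trained on.
from collections import defaultdict

history = [
    ("State U", "hired"), ("State U", "hired"),
    ("City College", "rejected"), ("City College", "rejected"),
    ("City College", "hired"),
]

def train(records):
    counts = defaultdict(lambda: defaultdict(int))
    for school, outcome in records:
        counts[school][outcome] += 1
    # Predict each school's historically most common outcome.
    return {school: max(c, key=c.get) for school, c in counts.items()}

model = train(history)
print(model["City College"])  # prints rejected: past bias, now automated
```

A real system is vastly more complex, but the failure mode is the same: without oversight, the model's "prediction" is just yesterday's decision replayed at scale.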
⚠️ “The real risk isn’t that computers will become more like humans, it’s that humans will become more like computers.” — Sherry Turkle
This quote is widely attributed to Sherry Turkle, an MIT professor and expert on the psychology of human-technology interaction. While the exact wording may be paraphrased, it echoes the central warning across her work — especially in The Second Self — that when we rely too heavily on machines, we begin to reshape ourselves in their image. Our conversations become more transactional. Our empathy thins. Our decision-making mimics logic stripped of context.
I’ve seen candidates with incredible skills slip through the cracks because they didn’t “score” right. I’ve seen teams miss out on diverse talent because they trusted the tool more than their own judgment. And I’ve seen companies hide behind the tech to avoid accountability.
AI isn’t the problem. The lack of guardrails is. And until we put ethics before efficiency, we’ll keep using automation to make decisions we should never have outsourced in the first place.
Some tech leaders think regulation is optional, ethics are for later, and any criticism is a threat to innovation. Whether it’s cryptocurrency schemes, questionable biotech ventures, or billionaire CEOs melting down in real time, we’ve allowed the myth of the “visionary founder” to excuse a lot of megalomania.
But lately, this God-complex isn’t just metaphorical — it’s literal.
🚨 In early 2024, Elon Musk warned that we are on the verge of creating a “digital god,” referring to AI systems so powerful and unregulated they could reshape human existence. He called for urgent regulatory intervention, pointing out that we’re racing ahead without understanding the consequences.
Meanwhile, Meta CEO Mark Zuckerberg fired back at rivals, accusing them of attempting to build a monolithic, all-knowing AI, saying bluntly: “It sounds like they’re trying to create God.” Instead, he positioned Meta’s approach as more modular and grounded — though skeptics aren’t exactly reassured, given Meta’s own history with data misuse and algorithmic controversy.
These public sparring matches aren’t just PR — they’re a glimpse into a dangerous dynamic: tech titans jockeying for control over systems that could dominate how we work, learn, vote, and even think.
And while they debate the shape of the future, the rest of us are left without a seat at the table.
Leadership in tech should demand more than product-market fit. It should demand responsibility, humility, and a grounding in reality. Because the moment we let founders play God is the moment we lose sight of who they're really building for.
I don’t believe tech needs more disruption — I believe it needs more accountability. And I’ve seen firsthand how easily that gets lost in the race to scale, impress investors, or ship the next shiny feature.
If we want to build systems worth trusting, we need leaders who take responsibility as seriously as innovation. Because if we don’t act like we deserve the license to build the future, we shouldn’t be surprised when people start to take it away.
#TechEthics #ResponsibleTech #AI #DataPrivacy #GhostWorkforce #StartupCulture #SurveillanceCapitalism #LeadershipMatters
As I explore technology and science for this newsletter, I'm reminded how much of our work depends on reliable IT networks, well-run desktop environments, and effective cloud systems. Beyond curating content, I also offer IT services tailored to your needs — desktop support, network solutions, and cloud administration. If you're facing a technological challenge, I'm here to help. Contact me at michael@conceptualtech.com.
Tech Topics is a newsletter with a focus on contemporary challenges and innovations in the workplace and the broader world of technology. Produced by Boston-based Conceptual Technology (http://www.conceptualtech.com), the articles explore various aspects of professional life, including workplace dynamics, evolving technological trends, job satisfaction, diversity and discrimination issues, and cybersecurity challenges. These themes reflect a keen interest in understanding and navigating the complexities of modern work environments and the ever-changing landscape of technology.
Tech Topics offers a multi-faceted view of the challenges and opportunities at the intersection of technology, work, and life. It prompts readers to think critically about how they interact with technology, both as professionals and as individuals. The publication encourages a holistic approach to understanding these challenges, emphasizing the need for balance, inclusivity, and sustainability in our rapidly changing world. As we navigate this landscape, the insights provided by these articles can serve as valuable guides in our quest to harmonize technology with the human experience.