Karmakar urges engineering graduates to build AI that earns trust

At a recent webinar addressing engineering graduates, Aveek Karmakar, Staff Software Engineer at Google, delivered a clear and timely message: the future of technology will not be defined by speed alone, but by how well it respects users' time and earns their trust.

Speaking to young engineers preparing to enter a rapidly evolving industry, Karmakar emphasized that artificial intelligence is now becoming the primary engine through which people search for homes, jobs, and financial services. In such a landscape, he noted, technical excellence must go hand in hand with ethical responsibility.

"The most effective technology is the kind that prioritizes the user's time and trust," he said, urging graduates to think beyond efficiency metrics and focus on building systems that people can rely on.

Drawing on his experience across global technology leaders such as Amazon, Zillow, and Google, Karmakar highlighted the shift from keyword-based computing to intent-driven intelligence. He explained that traditional systems required users to adapt to rigid commands, whereas modern AI must adapt to human language and context. Illustrating this, he described the evolution of natural language search systems that let users express needs conversationally, such as looking for "a quiet home with a workspace near a park," without navigating complex filters.

For aspiring engineers, the takeaway was clear: the goal is no longer just to process inputs, but to truly understand user intent and deliver meaningful outcomes.

A significant portion of Karmakar's address focused on AI ethics and the responsibility engineers carry in shaping digital systems. He warned that unchecked algorithms can amplify existing societal biases at scale. "If an algorithm inherits the biases of the past, it doesn't just replicate them, it scales them," he said, stressing the need for "guardrails by design." He encouraged students to actively build fairness, transparency, and accountability into the systems they create.
Referencing work on compliance-focused AI systems, he pointed out that responsible engineering is not a constraint; it is a competitive advantage in a trust-driven digital economy.

Karmakar also gave students a glimpse into the future of search and information access. At Google, his current work focuses on multimodal AI: systems that combine text, images, and video to deliver richer and faster results. He explained that the next generation of AI will not just answer questions but will present information in the most intuitive format for the user. This evolution, he noted, will fundamentally change how people interact with technology.

"The future of AI isn't just about technical capability," he said. "It's about how effectively we help people find the right information, save time, and maintain their trust."

Closing his address, Karmakar urged graduates to see themselves not just as coders, but as architects of human-centric technology. He emphasized that the real impact of AI lies in empowering people: freeing them from repetitive tasks and enabling them to focus on creativity, decision-making, and problem-solving.

His message to the next generation of engineers was direct: build systems that understand people, respect their time, and uphold their trust. In doing so, they will not only advance technology but define its purpose in society.



