Microsoft hopes people won’t become ‘over-reliant’ on its AI assistant

Microsoft executives emphasized the company’s commitment to security and responsible AI at a panel following their Surface and AI launch event in New York City.

This morning, Microsoft set the release date for its AI-powered Copilot feature and showed off some of its capabilities for the first time. At a “Responsible AI” panel following the announcement, company executives spoke about the danger of over-reliance on its generative software, which was shown creating blog posts, images, and emails based on user prompts.


Six months after the company laid off the team dedicated to upholding responsible AI principles in the products it shipped, the execs attempted to make a clear statement onstage: Everything is fine. Responsible AI is still a thing at Microsoft. And Copilot isn’t going to take your job.
“The product being called Copilot is really intentional,” said Sarah Bird, who leads responsible AI for foundational AI technologies at Microsoft. “It’s really great at working with you. It’s definitely not great at replacing you.”
Bird referenced a demonstration from the launch event that showed Copilot drafting an email on a user’s behalf. “We want to ensure that people are actually checking that the content of those emails is what they want to say,” Bird said. Panelists mentioned that Bing chat includes citations, which human users can then go back and verify.
“These types of user experience help reduce over-reliance on the system,” Bird said. “They’re using it as a tool, but they’re not relying on it to do everything for them.”
“We want to give people the ability to verify content, just like if you were doing any research,” Divya Kumar, Microsoft’s GM of search and AI marketing, further assured the audience. “The human factor is going to be so important.”
Panelists acknowledged that Copilot (at least, at this stage) will be vulnerable to misinformation and disinformation — including that which other generative AI tools might create. Microsoft has prioritized incorporating tools like citations and Content Credentials (which adds a digital watermark to AI-generated images in Bing) to ensure that people see Copilot’s generations as starting points rather than as replacements for their own work.
Panelists urged the audience not to fear the impact that generative tools might have. “My team and I are taking this really seriously,” said Chitra Gopalakrishnan, Microsoft’s partner director of compliance. “From development to deployment, all of these features go through rigorous ethical analysis, impact analysis, as well as risk mitigation.”
The panelists did, however, acknowledge later on that generative tools might drastically change the landscape of viable careers.
“When you have a powerful tool to partner with, what you need to do is different,” Bird said. “We know some of the jobs are going to change.”