Friday, November 22nd

    Microsoft will soon let you clone your voice for Teams meetings

    Microsoft has unveiled Interpreter in Teams, a tool for Microsoft Teams that provides real-time voice-to-voice translation capabilities.

    Microsoft plans to let Teams users replicate their voices so they can speak to others in different languages during meetings. On Tuesday, at Microsoft's Ignite 2024 conference, the company unveiled Interpreter in Teams, a tool for Microsoft Teams that provides "real-time voice-to-voice" translation capabilities. Starting in early 2025, people who use Teams for meetings will be able to use Interpreter to simulate their voices in up to nine languages: English, French, German, Italian, Japanese, Korean, Portuguese, Mandarin Chinese, and Spanish.


    " Imagine being suitable to sound exactly like yourself in another language," Microsoft CMO Jared Spataro wrote in a blog post participated to TechCrunch. “ Translator in brigades provides real- time voice restatement during meetings, and you can choose to have it mimic your voice for a more particular and engaging experience. ” 


    Microsoft provided few specific details about the feature, which will only be available to Microsoft 365 subscribers. But it said the tool doesn't store any biometric data, doesn't add emotion beyond what's "naturally present" in the voice, and can be turned off in Teams settings. "Interpreter is designed to replicate the speaker's message as faithfully as possible, without adding assumptions or extraneous information," a Microsoft spokesperson said. "Voice simulation can only be enabled when users provide consent via a notification during the meeting or by enabling 'Voice simulation consent' in settings."
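    Microsoft hasn't published a developer API for Interpreter, so the exact mechanics of that consent gate aren't public. As a purely illustrative sketch, the flow the spokesperson describes, off by default and enabled either through a persistent settings toggle or a one-time in-meeting notification, might look something like the following TypeScript. Every name here (`VoiceSimulationSettings`, `resolveConsent`, and so on) is hypothetical, not Microsoft's actual Teams API:

```typescript
// Hypothetical sketch of the consent gate described above. None of these
// names come from Microsoft's real Teams APIs; they are invented purely
// for illustration.

interface VoiceSimulationSettings {
  // Mirrors the "Voice simulation consent" toggle described in Teams settings.
  voiceSimulationConsent: boolean;
}

type ConsentSource = "settings" | "meeting-notification" | "none";

// Resolve consent: the persistent settings toggle wins; otherwise show a
// one-time notification during the meeting and record the user's answer.
async function resolveConsent(
  settings: VoiceSimulationSettings,
  promptUser: () => Promise<boolean>,
): Promise<ConsentSource> {
  if (settings.voiceSimulationConsent) return "settings";
  return (await promptUser()) ? "meeting-notification" : "none";
}

// Voice simulation stays off unless consent was given; translation itself
// still works either way, just with a generic synthetic voice.
async function startInterpreter(
  settings: VoiceSimulationSettings,
  promptUser: () => Promise<boolean>,
): Promise<{ translate: boolean; simulateVoice: boolean }> {
  const consent = await resolveConsent(settings, promptUser);
  return { translate: true, simulateVoice: consent !== "none" };
}
```

    The key property in this sketch is that it fails closed: absent an explicit "yes" from either path, voice simulation never turns on, and translation falls back to a generic voice.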


    A number of companies have developed tech that digitally mimics voices and sounds reasonably natural. Meta recently announced that it's testing a translation tool that can automatically translate voices in Instagram Reels, while ElevenLabs offers a robust multilingual speech generation platform.


    AI translations tend to be less lexically rich than human translations, and AI translators often struggle to accurately convey colloquialisms, analogies, and cultural nuances. Still, the cost savings are attractive enough for some people to accept the trade-off. According to Markets and Markets, the natural language processing sector, which includes translation technology, could be worth $35.1 billion by 2026. However, AI clones also raise security concerns.

     

    Deepfakes are spreading like wildfire on social media, making it hard to distinguish truth from misinformation. Deepfakes featuring President Joe Biden, Taylor Swift, and Vice President Kamala Harris have been viewed and shared millions of times this year. Deepfakes have also been used to target individuals, including to impersonate relatives. According to the FTC, losses from impersonation scams exceeded $1 billion last year.


    Just this year, a group of cybercriminals reportedly staged a Teams meeting with company executives that was so convincing that the targeted company wired the criminals $25 million. Earlier this year, OpenAI decided to hold off on releasing its voice cloning technology, Voice Engine, in part due to the risks (and the optics).


    From what has been revealed so far, Interpreter in Teams is a fairly narrow application of voice cloning. Still, that doesn't mean the tool can't be misused: one could imagine a malicious attacker feeding Interpreter a deceptive recording (for example, someone requesting bank account information) to obtain a translation in the target language. Hopefully, we'll get a better idea of the safeguards Microsoft will add to Interpreter in the coming months.