New Webex features aim to create ‘a more inclusive collaboration experience’ for virtual meetings

In a blog post published this week, Webex’s Javed Khan detailed how the company is working to make its popular collaboration software more accessible and inclusive.

“We foster a culture of accessibility-focused thinking and embed universal design principles, backed by comprehensive user engagement and feedback, throughout the development cycle,” Khan wrote. “Understanding how assistive technologies are used is important for developing products and features that work for everyone. At Webex, members of our design, engineering, and product management teams are required to complete accessibility training and hands-on training.”

Webex is receiving a slew of accessibility-focused usability updates that, Khan said, “are having a big impact.” They include expanded language support for automated subtitles, new keyboard shortcuts, and more.

Among the notable additions, two highlighted by Khan are enhanced interpretation and audio intelligence. Enhanced Interpretation lets interpreters (and those who rely on them) customize the experience so it works for both parties. The feature supports up to 110 languages, and the user interface distinguishes interpreters from attendees and panelists with a distinct icon. Interpreters can choose which language direction they work in, coordinate handoffs with other interpreters, and more.

As for Audio Intelligence, it is designed to “distinguish and enhance the clarity of an individual’s voice” using Webex’s proprietary artificial intelligence technology. Users can choose to optimize for their own voice or for all voices with a single click or keyboard shortcut. The feature is rolling out to “select customers” now and will be widely available next month. Webex has a video on YouTube demonstrating the feature.