
Windows 365 Link is a $349 mini PC that streams Windows from the cloud
Microsoft has created its own purpose-built device for the cloud-based version of Windows.

The CrowdStrike catastrophe that took down 8.5 million Windows PCs and servers in July has left many of Microsoft’s biggest customers looking for answers to make sure that such an event never happens again. Now, Microsoft has some answers in the form of a new Windows Resiliency Initiative that’s designed to improve Windows security and reliability.

The Windows Resiliency Initiative includes core changes to Windows that will make it easier for Microsoft’s customers to recover Windows-based machines if there’s ever another CrowdStrike-like incident. There are also new Windows platform improvements that provide stronger controls over which apps and drivers are allowed to run and that enable antivirus processing outside of kernel mode.

Microsoft has developed a new Quick Machine Recovery feature in light of the CrowdStrike incident that will enable IT admins to target fixes at machines remotely even when they’re unable to boot properly. Quick Machine Recovery leverages improvements to the Windows Recovery Environment (Windows RE).

“In a future event, hopefully, that never happens, we could push out [an update] from Windows Update to this Recovery Environment that says delete this file for everyone,” explains David Weston, vice president of enterprise and OS security at Microsoft, in an interview with The Verge. “If there’s one central problem that we need to push to a lot of customers, this gives us the ability to do that from Windows RE.”
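Microsoft hasn’t published what a Quick Machine Recovery payload looks like, but the idea Weston describes can be sketched in rough pseudocode: a centrally pushed manifest of remediation steps that the recovery environment applies before the full OS boots. Everything below — the manifest format, the driver path, and the apply_remediation helper — is hypothetical, offered only to illustrate the concept.

```python
# Hypothetical sketch only: Microsoft has not published the Quick Machine
# Recovery payload format. This illustrates the idea Weston describes --
# a centrally pushed instruction ("delete this file") that a recovery
# environment applies before the full OS boots. The driver path is made up.
import json
from pathlib import Path

REMEDIATION = json.loads("""
{
  "id": "example-remediation-001",
  "actions": [
    {"op": "delete_file", "path": "C:/Windows/System32/drivers/faulty-sensor.sys"}
  ]
}
""")

def apply_remediation(manifest: dict, dry_run: bool = True) -> None:
    """Apply each action from a (hypothetical) remediation manifest."""
    for action in manifest["actions"]:
        if action["op"] == "delete_file":
            target = Path(action["path"])
            print(f"{'Would delete' if dry_run else 'Deleting'}: {target}")
            if not dry_run and target.exists():
                target.unlink()

apply_remediation(REMEDIATION)
```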

Weston has talked to hundreds of customers since the CrowdStrike debacle, and they’re all asking for better recovery tools, safer deployment practices from security vendors, and improved resiliency from Windows itself to ensure the events of July never repeat.

“Every one of them is saying I owe my board a response on how this doesn’t happen again,” says Weston. Microsoft is now requiring that security vendors that are part of the Microsoft Virus Initiative (MVI) take specific steps to improve security and reliability. These steps include better testing and response processes, alongside safe deployment practices for updates to Windows PCs and servers — including gradual rollouts and monitoring and recovery procedures.

Microsoft has also been working with its MVI partners to enable antivirus processing outside of the kernel. CrowdStrike’s software runs at the kernel level of Windows — the core part of an operating system that has unrestricted access to system memory and hardware. This deep kernel access allowed a faulty update to generate a Blue Screen of Death as soon as affected systems started up.

“We’re developing a framework that [security vendors] want to use and they’re incentivized to use, now it has to be good enough to fill their use case,” explains Weston. Microsoft is building that framework now, and a private preview will be available to Windows security partners in July 2025.

“It’s a significant technical challenge to centralize this and meet everyone’s requirements, but we have really experienced people across endpoint detection and the kernel space,” says Weston. At Microsoft’s Windows Endpoint Security Ecosystem Summit in September, the company had kernel architects from the Windows team in attendance to talk directly to security vendors like CrowdStrike about moving scanning outside of the kernel.

Ultimately, it’s up to Microsoft to lock Windows down further and to provide a framework that works well for security vendors, too. “We sort of control physics here. We can change the memory manager or the driver framework, and we don’t have to abide by the rules that a third-party developer would,” says Weston. “That’s why I’m bullish on our ability to execute here.”

The administrator improvements coming to Windows 11. Image: Microsoft

Alongside the resiliency improvements, Windows 11 is also getting administrator protection soon. It’s a new feature that lets users have the security of a standard user but with the ability to make system changes and even install apps when needed. Administrator protection temporarily grants admin rights for a specific task once a user has authenticated using Windows Hello and then removes them straight after a system change is made or an app is installed. “Windows creates a temporary isolated admin token to get the job done. This temporary token is immediately destroyed once the task is complete, ensuring that admin privileges do not persist,” says Weston.
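Microsoft hasn’t shared implementation details beyond Weston’s description, but the flow is easy to picture. The sketch below is conceptual Python, not Windows code: authenticate, mint a short-lived isolated token for a single task, and destroy it as soon as the task finishes.

```python
# Conceptual sketch only -- not Windows code. It mirrors the flow Weston
# describes: authenticate with Windows Hello, mint a short-lived isolated
# admin token for one task, then destroy it so elevation never persists.
from contextlib import contextmanager
import secrets

def windows_hello_authenticate() -> bool:
    """Stand-in for a biometric/PIN prompt (hypothetical)."""
    return True

@contextmanager
def temporary_admin_token(task: str):
    if not windows_hello_authenticate():
        raise PermissionError("authentication failed")
    token = secrets.token_hex(16)           # isolated, single-use token
    print(f"granted temporary admin token for: {task}")
    try:
        yield token
    finally:
        token = None                         # token destroyed once the task completes
        print("admin privileges revoked")

with temporary_admin_token("install app") as tok:
    pass  # perform the elevated action here
```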

The White House has been encouraging developers to use memory-safe programming languages like Rust, and Microsoft is making changes to Windows, too. It’s “gradually moving functionality from C++ implementation to Rust” in Windows, to help further improve the security of the OS.

Prefer to offload all your Windows tasks to the cloud? Microsoft may just have the compact, desk-bound computer for you.

On Tuesday at Microsoft Ignite 2024, the tech giant unveiled Windows 365 Link, a fanless, lightweight PC that connects to Windows 365. Windows 365 is a cloud-hosted, virtual Windows machine — like a typical Windows installation, but running on a remote server.

Set to come to “select markets” in April of next year for the suggested retail price of $349, Windows 365 Link boots quickly (“in seconds,” Microsoft says), “instantly” wakes from sleep, and sports several display and peripheral ports. You’ll find support for dual 4K monitors, four USB ports, audio and Ethernet jacks, and the standard array of wireless connectivity (Wi-Fi 6E and Bluetooth 5.3).

Now, Windows 365 Link isn’t a proper PC. While it can handle things like Microsoft Teams meetings and Webex sessions, the device lacks storage and can’t install local applications. Moreover, its small, Windows-based operating system is extremely locked down. The security features and passwordless login can’t be disabled, Microsoft says.

Windows 365 Link in all its port glory. Image Credits: Microsoft

Of course, Microsoft’s not targeting consumers with Windows 365 Link — enterprises are the play here. The company notes that businesses can manage (and wipe) Windows 365 Link units remotely, and configure them to automatically check for, download, and install updates.

Microsoft is touting the environmental friendliness of Windows 365 Link, which it claims consumes less energy than most desktop PCs. According to the company, Windows 365 Link contains 90% recycled aluminum alloy in its top shield, 100% recycled aluminum in its bottom plate, 100% recycled copper, and 96% recycled tin solder in its motherboard.

Orgs in Australia, Canada, Germany, Japan, New Zealand, the U.K., and the U.S. can apply for the Windows 365 Link preview program starting today. With any luck, it’ll avoid the same fate as Microsoft’s last miniature PC.

AI agents are the talk of the enterprise right now. But business leaders want to hear about tangible results and relevant use cases rather than futuristic, not-quite-there-yet scenarios, and they demand tools that are easy to deploy and use and that support their preferred model(s).

Microsoft claims to have all these concerns covered with new no-code and low-code capabilities in Microsoft 365 Copilot. Today at Microsoft Ignite, the tech giant announced that users can now build their own custom autonomous agents or deploy out-of-the-box, purpose-built agents, via a bring-your-own setup that provides access to the 1,800-plus models in the Azure AI catalog. (See our separate story today about how Microsoft has quietly assembled the largest AI agent ecosystem — and no one else is close).

“Companies have done a lot of AI exploration and really want to be able to measure and understand how agents can help them be more efficient, improve performance, and decrease cost and risk,” Lili Cheng, corporate VP of the Microsoft AI and research division, told VentureBeat. “They’re really leaning into scaling out their copilots.”

Supporting bring-your-own-knowledge, bring-your-own-model

According to IDC, in the next 24 months, more and more companies will build custom, tailored AI tools. Indeed, vendors — from tech giants such as Salesforce and Snowflake to smaller players like CrewAI and Sema4.ai — are increasingly pushing platforms to market that promise to revolutionize enterprise operations. 

Microsoft introduced Copilot in February 2023 and has now infused it with a suite of new capabilities to support agentic AI. Autonomous capabilities now in public preview allow users to build agents that act on their behalf without additional prompting. This means agents can work and act in the background without human oversight. 

Users can start from templates for common scenarios (such as sales orders and deal accelerator agents) in Copilot Studio. Or, more advanced developers can take advantage of a new Agent SDK (now available in preview) to build full-stack, multichannel agents that integrate with various Microsoft services and can be deployed across Microsoft, third-party, and web channels. 

New integrations with Azure AI Foundry will support bring-your-own-knowledge (now in preview), which lets custom search indices be added as a knowledge source, and bring-your-own-model (now in private preview). This will allow users to pull from the 1,800-some-odd models (and counting) in Azure’s catalog. 

This element is critical, as users are demanding the ability to securely use proprietary data and combine and test different models without getting locked into one or the other. “People want a variety of models, they want to be able to fine-tune models,” said Cheng. 
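Microsoft hasn’t shown exactly how an agent declares its model and knowledge sources, but the shape of the feature can be sketched. The snippet below is purely illustrative: the AgentDefinition class, the model identifier, and the search-index URI are all hypothetical, not Copilot Studio or Azure AI Foundry APIs.

```python
# Illustrative sketch only: these classes and names are hypothetical and are
# not the Copilot Studio or Azure AI Foundry APIs. The point is the shape of
# "bring your own knowledge" (a custom search index) and "bring your own
# model" (a catalog model chosen per agent).
from dataclasses import dataclass, field

@dataclass
class AgentDefinition:
    name: str
    instructions: str
    model: str                               # a model picked from a catalog
    knowledge_sources: list[str] = field(default_factory=list)

sales_agent = AgentDefinition(
    name="deal-accelerator",
    instructions="Summarize open opportunities and draft follow-up emails.",
    model="example-catalog/llm-of-choice",            # hypothetical identifier
    knowledge_sources=["search-index://crm-accounts"],  # hypothetical custom index
)

print(sales_agent)
```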

Ready-made agents for HR, translation, project management

However, not all tasks require a custom solution; prebuilt agents can be useful across enterprises. Microsoft is releasing several ready-made agents in Copilot that can handle simple, repetitive tasks or more complex multi-step processes. These include: 

  • Agents in SharePoint, which allow users to create their own tailored agents that they can name and personalize. Users can ask questions, receive real-time answers, and share agents across emails, meetings, and chats. Microsoft emphasizes that agents follow existing SharePoint user permissions and sensitivity labels to help ensure that sensitive information isn’t overshared.
  • Employee self-service agent, which answers common workplace policy-related questions and takes action on HR and IT-related tasks. For instance, employees can retrieve benefits and payroll information, request a new device, or start a leave of absence form. 
  • Facilitator agent, which takes real-time notes in Teams meetings and chats and provides a summary of important information as the conversation unfolds. 
  • Interpreter agent, which provides real-time translation in team meetings in up to nine languages. Participants can also have the Interpreter simulate their voice.
  • Project Manager agent, which automates processes in Planner, handling projects from creation to execution. The agent can automatically create new plans from scratch or use templates; it then assigns tasks, tracks progress, sends notifications, and provides status reports. 

Further, a new Azure AI Foundry SDK offers a simplified coding experience and toolchain for developers to customize, test, deploy, and manage agents. Users can choose from 25 pre-built templates, integrate Azure AI into their apps, and access common tools including GitHub or Copilot Studio. 

Cheng pointed to the importance of low-code and no-code tools, as enterprises want to accommodate teams with a range of skills. “Most companies don’t have big AI teams or even development teams,” she said. “They want more people to be able to author their copilots.”

The goal is to greatly simplify the agent-building process so that enterprises “build something once and use it wherever their customers are,” she said. Tooling should be simple and easy to use so that app creators don’t even notice when things get more complicated on the back end. Cheng posited: “Something might be more difficult, but you don’t know it’s more difficult, you just want to get your job done.”

McKinsey, Thomson Reuters use cases

Initial use cases have revolved around support, such as managing IT help desks, as well as HR scenarios including onboarding, said Cheng. 

McKinsey & Company, for its part, is working with Microsoft on an agent that will speed up client onboarding. A pilot showed that lead time could be reduced by 90% and administrative work by 30%. The agent can identify expert capabilities and staffing teams and serve as a platform for colleagues to ask questions and request follow-ups. 

Meanwhile, Thomson Reuters built an agent to help make the legal due diligence process — which requires significant expertise and specialized content — more efficient. The platform combines knowledge, skills, and advanced reasoning from the firm’s gen AI tool CoCounsel to help lawyers close deals more quickly and efficiently. Early tests indicate that several tasks in these workflows could be cut by at least 50%. 

“We really see people combining more traditional copilots — where you have AI augmenting people skills and providing personal assistance — together with autonomous systems,” said Cheng. Agents are increasingly authoring processes and workflows and working across groups of people and in multi-agent systems, she noted. 

AI agents aren’t new (but using them on top of LLMs is)

While they may be all the talk now, agents are not new, Microsoft Source writer Susanna Ray emphasizes in a blog post out today. “They’re getting more attention now because recent advances in large language models (LLMs) help anyone — even outside the developer community — communicate with AI,” she writes. 

Agents serve as a layer on top of LLMs, observing and collecting information and providing input so that together they can generate recommendations for humans or, if permitted, act on their own. “That agent-LLM duo makes AI tools more tangibly useful,” Ray notes, adding that agents will become even more useful and autonomous with ongoing innovations in memory, entitlements, and tools.
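That layering can be sketched in a few lines. The snippet below is a generic illustration of the observe-decide-act loop Ray describes, not any specific Microsoft product API; call_llm and the single tool are placeholders.

```python
# Minimal sketch of the "agent as a layer on top of an LLM" idea described
# above. `call_llm` and the tools are placeholders, not a product API.
from typing import Callable

def call_llm(prompt: str) -> str:
    """Placeholder for a call to whatever LLM the agent is built on."""
    return "recommend: create_ticket"

def agent_step(observation: str, tools: dict[str, Callable[[], str]],
               autonomous: bool = False) -> str:
    """Collect context, ask the model, then recommend or (if permitted) act."""
    decision = call_llm(f"Observation: {observation}\nWhat should be done?")
    action = decision.removeprefix("recommend: ")
    if autonomous and action in tools:
        return tools[action]()               # act on the user's behalf
    return f"Recommendation for a human: {action}"

tools = {"create_ticket": lambda: "ticket #123 created"}
print(agent_step("Employee reports a broken laptop", tools, autonomous=True))
```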

Cheng pointed out that Microsoft began talking about conversational AI about eight years ago. Before AI agents, conversation data “was always kind of lost and siloed.” Now, agentic AI can bring intelligence to users and provide context in real time. 

“People just want that tooling to be more natural,” she said. “It’s phenomenal that we can do a lot of these things that we dreamed about. Being able to combine all these sources effortlessly is really groundbreaking.”

Microsoft Teams meetings are getting a new interpreter feature that lets each participant speak or listen in the language of their choosing. Interpreter in Teams uses real-time AI-powered speech-to-speech translation to simulate your speaking voice during meetings.

A preview will be available in early 2025 that will include up to nine languages and the ability for the interpreter feature to simulate your personal voice in a different language.
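Microsoft hasn’t detailed the pipeline behind Interpreter, but real-time speech-to-speech translation is commonly described as three stages: recognize the speech, translate the text, and synthesize audio, optionally in a voice that simulates the original speaker. The sketch below is conceptual only, with stub functions standing in for each stage.

```python
# Conceptual pipeline only -- stub functions, not the Teams implementation.
# Stages: speech recognition -> text translation -> speech synthesis
# (optionally using a profile that simulates the speaker's own voice).
def recognize(audio_chunk: bytes, source_lang: str) -> str:
    return "hello everyone"                      # stub transcription

def translate(text: str, source_lang: str, target_lang: str) -> str:
    return f"[{target_lang}] {text}"             # stub translation

def synthesize(text: str, target_lang: str, voice: str | None) -> bytes:
    return text.encode()                         # stub audio output

def interpret(audio_chunk: bytes, source_lang: str, target_lang: str,
              simulate_voice: bool = False) -> bytes:
    text = recognize(audio_chunk, source_lang)
    translated = translate(text, source_lang, target_lang)
    voice = "speaker-profile" if simulate_voice else None  # hypothetical profile id
    return synthesize(translated, target_lang, voice)

print(interpret(b"...", "en", "es", simulate_voice=True))
```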

It’s part of a series of AI-powered changes coming to Microsoft Teams. Meeting transcription will soon handle multilingual meetings, with up to 31 languages supported for a meeting transcript. 

The Microsoft Teams Super Resolution option. Image: Microsoft

In early 2025, Microsoft will also preview the ability for Teams to understand and recap any visual content that was shared onscreen from PowerPoint or the web during meetings, alongside the usual transcript and chat summaries. Copilot will also be able to perform a quick summary of any files that have been shared in the chat interface in Teams, so you don’t have to open the entire file.

Microsoft is also using Copilot Plus PCs to enable a Teams Super Resolution feature that leverages the local NPU to enhance the quality of video calls. It could, for example, upscale a colleague’s video feed when they’re dialing in over a weak internet connection. In January, Windows app developers will also be able to use similar image super-resolution APIs to enhance blurry images, alongside Copilot Runtime updates like image segmentation, object erase, and image description features.

If Windows in mixed reality sounds like your idea of a good time, rejoice. Microsoft said on Tuesday at Microsoft Ignite 2024 that it’s bringing the “full capabilities” of Windows 11 to the Meta Quest 3 and Quest 3S in December as part of a public preview.

“Full capabilities,” in this context, means that you’ll be able to access a local Windows PC or cloud instance of Windows (via Windows 365) from a Quest headset. Microsoft says it only takes “seconds” to connect, and likens the experience to a “private, high-quality, large, multi-monitor workstation.” We’ll be the judge of that.

Supported apps in this new Windows modality extend into 3D space — a capability enabled by what Microsoft is calling Volumetric Apps. During a demo in May, Microsoft showed off a digital exploded view of an Xbox controller from the perspective of a Quest 3 wearer — an object that the wearer could manipulate with their hands.

Developers can sign up to receive access to an API that allows them to build plug-ins for new or existing 3D Windows desktop apps.

Microsoft has teased Windows content for Quest as far back as 2022 when it said it would partner with Meta to bring select Windows apps including Microsoft Teams to Meta headsets. Last December, Microsoft launched Microsoft 365 productivity features from Word, Excel, and PowerPoint on the Quest, and brought its video game streaming service, Xbox Cloud Gaming, to the hardware.

At the Ignite developer conference today, Microsoft unveiled two new chips designed for its data center infrastructure: the Azure Integrated HSM and the Azure Boost DPU. 

Scheduled for release in the coming months, these custom-designed chips aim to address security and efficiency gaps in existing data centers, further optimizing Microsoft’s servers for large-scale AI workloads. The announcement follows the launch of Microsoft’s Maia AI accelerators and Cobalt CPUs, marking another major step in the company’s comprehensive strategy to rethink and optimize every layer of its stack, from silicon to software, to support advanced AI.

The Satya Nadella-led company also detailed new approaches aimed at managing power usage and heat emissions of data centers, as many continue to raise alarms over the environmental impact of data centers running AI.

Just recently, Goldman Sachs published research estimating that advanced AI workloads are poised to drive a 160% increase in data center power demand by 2030, with these facilities consuming 3-4% of global power by the end of the decade.

The new chips

While continuing to use industry-leading hardware from companies like Nvidia and AMD, Microsoft has been raising the bar with its custom chips.

Last year at Ignite, the company made headlines with Azure Maia AI accelerator, optimized for artificial intelligence tasks and generative AI, as well as Azure Cobalt CPU, an Arm-based processor tailored to run general-purpose compute workloads on the Microsoft Cloud.

Now, as the next step in this journey, it has expanded its custom silicon portfolio with a specific focus on security and efficiency. 

The new in-house security chip, Azure Integrated HSM, is a dedicated hardware security module designed to meet FIPS 140-3 Level 3 security standards.

According to Omar Khan, the vice president for Azure Infrastructure marketing, the module essentially hardens key management to make sure encryption and signing keys stay secure within the bounds of the chip, without compromising performance or increasing latency.

To achieve this, Azure Integrated HSM leverages specialized hardware cryptographic accelerators that enable secure, high-performance cryptographic operations directly within the chip’s physically isolated environment. Unlike traditional HSM architectures that require network round-trips or key extraction, the chip performs encryption, decryption, signing, and verification operations entirely within its dedicated hardware boundary.
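The distinction can be illustrated with a toy example. The class below is not the Azure Integrated HSM’s interface; it is a conceptual stand-in showing what “the key never leaves the boundary” means in practice: callers hand in data and get back a signature, never the key.

```python
# Conceptual sketch only -- the interface below is hypothetical, not the Azure
# Integrated HSM's. It contrasts the traditional pattern (export a key or make
# a network round-trip to sign) with in-boundary signing, where private key
# material never leaves the device and only the signature comes back.
import hmac, hashlib, secrets

class InBoundaryHsm:
    """Toy stand-in: keys are created and used inside the object, never exported."""
    def __init__(self) -> None:
        self._keys: dict[str, bytes] = {}        # private material stays "inside"

    def create_key(self, key_id: str) -> None:
        self._keys[key_id] = secrets.token_bytes(32)

    def sign(self, key_id: str, message: bytes) -> bytes:
        # Signing happens within the boundary; callers only ever see the signature.
        # HMAC-SHA256 is used here purely as a toy signing primitive.
        return hmac.new(self._keys[key_id], message, hashlib.sha256).digest()

hsm = InBoundaryHsm()
hsm.create_key("tls-signing")
signature = hsm.sign("tls-signing", b"server hello")
print(signature.hex())
```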

While Integrated HSM paves the way for enhanced data protection, Azure Boost DPU (data processing unit) optimizes data centers for highly multiplexed data streams corresponding to millions of network connections, with a focus on power efficiency. 

Azure Boost DPU, Microsoft’s new in-house data processing unit chip

The offering, Microsoft’s first in this category, complements CPUs and GPUs by absorbing multiple components of a traditional server into a single piece of silicon, from high-speed Ethernet and PCIe interfaces to network and storage engines, data accelerators, and security features.

It works with a sophisticated hardware-software co-design, where a custom, lightweight data-flow operating system enables higher performance, lower power consumption, and enhanced efficiency compared to traditional implementations.

Microsoft expects the chip to run cloud storage workloads at roughly one-third the power and four times the performance of existing CPU-based servers.
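Read together, those two claims imply a sizeable efficiency gain. The back-of-the-envelope arithmetic below is ours, not Microsoft’s: four times the performance at roughly one-third the power works out to about 12 times the performance per watt.

```python
# Back-of-the-envelope arithmetic (ours, not Microsoft's): if the DPU delivers
# 4x the performance at roughly 1/3 the power of a CPU-based server, the
# implied gain in performance per watt is about 12x.
performance_ratio = 4.0       # "four times the performance"
power_ratio = 1.0 / 3.0       # "three times less power" read as one-third the power
perf_per_watt_gain = performance_ratio / power_ratio
print(f"~{perf_per_watt_gain:.0f}x performance per watt")   # ~12x
```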

New approaches to cooling, power optimization

In addition to the new chips, Microsoft also shared advances aimed at improving data center cooling and optimizing power consumption.

For cooling, the company announced an advanced version of its heat exchanger unit – a liquid cooling ‘sidekick’ rack. It did not share the specific gains promised by the tech but noted that it can be retrofitted into Azure data centers to manage heat emissions from large-scale AI systems using AI accelerators and power-hungry GPUs such as those from Nvidia.

Liquid cooling heat exchanger unit for efficient cooling of large-scale AI systems

On the energy management front, the company said it has collaborated with Meta on a new disaggregated power rack, aimed at enhancing flexibility and scalability.

“Each disaggregated power rack will feature 400-volt DC power that enables up to 35% more AI accelerators in each server rack, enabling dynamic power adjustments to meet the different demands of AI workloads,” Khan wrote in the blog.

Microsoft is open-sourcing the cooling and power rack specifications for the industry through the Open Compute Project. As for the new chips, the company said it plans to install Azure Integrated HSMs in every new data center server starting next year. The timeline for the DPU roll-out, however, remains unclear at this stage.

Microsoft Ignite runs from November 19-22, 2024.
