Intel’s Strategy for the Post-PC World Begins to Take Real Shape



HIGHLIGHTS

  • Intel made a number of strategy and product announcements at IDF 2016
  • The amount of data each person creates each day is expected to explode
  • PCs aren’t dead, but many other devices will become intertwined with our lives

After a less-than-stellar year in which it announced 12,000 layoffs, a major deviation from its CPU roadmap, and a complete withdrawal from the smartphone SoC race, Intel’s vision for a connected, compute-intensive post-PC future is finally beginning to take shape. At the company’s annual developer conference in San Francisco this week, attention was diverted away from conventional devices such as PCs and laptops, and lavished on other perceived areas of growth.

Intel will spend the next year trying to build strong foundations for emerging categories such as virtual reality, drones, autonomous vehicles, and domestic robots, as well as custom “smart” devices for increasingly specific niches, which it hopes will become part of the fabric of our everyday lives. Simultaneously, it wants to cement its position in the data centre, where it expects to see massive growth as sensor-laden devices generate enormous amounts of data that need to be interpreted and put to use. That involves not only processing all that data, but also moving it from endpoint devices to the cloud and back to the user quickly enough to be useful.

One of the biggest things to come into focus for Intel this year was its RealSense 3D camera initiative. First promoted as a “perceptual computing” enhancement for PCs, tablets, and laptops, RealSense seemed like a gimmick without many useful real-world applications. It was compared to Microsoft’s Kinect – amusing in specific scenarios, but largely unnecessary. Intel suggested that it could replace traditional input devices, making computers more human, but few device manufacturers tried to make that happen. Over the past year, the initiative has pivoted away from the paradigm of manipulating a UI and has really come into its own, powering security features such as Windows Hello, enabling drones to understand their environments and avoid collisions, and driving all kinds of interactive robots.

Of course, there was the big news about Project Alloy, an ecosystem designed to enable “mixed reality” experiences, which Intel defines as similar to virtual reality but incorporating elements of the real world rather than trying to replace them. Motion tracking is handled by RealSense – the headset is opaque, and real-world objects are captured and fed into the virtual environment, unlike augmented reality, which is overlaid on what you see through your own eyes. Then there were Joule, a tiny new development platform designed specifically for RealSense applications, and Euclid, a ready-made PC of sorts with RealSense integrated into its candybar-sized body.

[Image: Intel Joule development platform]

In the near future, a new, thinner RealSense module with increased range and sensitivity will become available. It will be easier to integrate into consumer products, and has a real chance of going mainstream. Demos on the show floor included a robotic companion for diabetic children, safety goggles which can detect if engineers use the wrong parts when working on sensitive equipment, a projector that can turn any tabletop into an interactive game, a heads-up display that bikers can wear with their helmets, and an educational modular robot-building kit.

While some of these projects are frivolous and some might never make it out of the proof-of-concept stage, the common thread was that no matter how small the niche, if there’s a need, Intel wants to fill it. A small number of these projects might wind up resonating with people, or at least planting the seed of an idea in the minds of other attendees.

No matter what, we are going to have more devices in our lives, or at least devices that do more things. That’s in addition to the appliances and environmental controls around us, our clothes, our cars, the tools we use at work, the smart city infrastructure we walk past, and more. All of them will be generating data, and a lot of that data will be parsed using artificial intelligence of some sort. Intel wants to be in that space as well, not just processing it all with a new generation of Xeon Phi chips, but also moving it around the world using high-speed, low-latency silicon photonics links between servers and 5G connections to the endpoints where it’s needed.

[Image: Intel Project Alloy headset]

We finally seem to be at a point where IoT ceases to be a buzzword and becomes tangible and relatable. It all comes together – RealSense can be used for object and pattern recognition, but that only works when there are massive data sets to learn from. Data is generated by the cameras, sent to a huge data centre, processed by artificial intelligence, and sent back in a useful form. Some applications don’t need that to happen quickly, but for things like autonomous vehicles, medical robots, and even public utilities, milliseconds matter. Intel wants us to know that it’s working on the entire chain.

Whereas last year’s IDF was flashy but somewhat unfocused, attendees of this year’s show came away with a clear sense of the company’s direction – and it isn’t about PCs. While PCs aren’t quite in the rear-view mirror yet, we can expect PC hardware to decline in overall importance as a much bigger picture of connected devices and services emerges around us.

Disclosure: The correspondent’s flights and hotel for IDF were sponsored by Intel.

Tags: IDF, Intel, Intel 2016