Google invests in startup that’s turning dead PCs into Chromebooks

Neverware, a New York City-based startup with a software platform that converts legacy computers into Chromebooks, announced today that it has raised a new round of investment led by Google. It won’t say the exact amount, but it will likely be more than the $6.5 million it raised in its last round.

The synergy between the two parties is obvious. CloudReady is built on the same open-source technology as Google’s Chrome OS. IT teams from corporations or schools that decide to use Neverware also get to leverage integration with Chrome Enterprise through Google’s cloud-based Admin console, simplifying and unifying remote management of the disparate models in their fleets.

An investment that makes sense.


iOS Control Center: Understanding how the Wi-Fi and Bluetooth toggles work

John Gruber, on the fact that the Wi-Fi and Bluetooth buttons in Control Center no longer act as on/off switches:

This is an interesting feature, but I think it’s going to confuse and anger a lot of people. Until iOS 11, the Wi-Fi and Bluetooth toggles in Control Center worked the way it looked like they worked: they were on/off switches. Now, in iOS 11, they still look like on/off switches, but they act as disconnect switches.

Off the top of my head, I would suggest making them three-way switches: on and connected, on but disconnected, and off.
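The three-way switch suggestion maps naturally onto a simple state model. A hypothetical sketch in Swift (the type and case names are mine for illustration, not anything in Apple’s API):

```swift
// Hypothetical model of the suggested three-way toggle.
enum RadioToggleState {
    case onAndConnected    // radio on and joined to a network or device
    case onButDisconnected // radio on but disconnected (iOS 11's tap behavior)
    case off               // radio fully powered down
}

// Tapping the toggle could cycle through the three states in order.
func nextState(after state: RadioToggleState) -> RadioToggleState {
    switch state {
    case .onAndConnected:    return .onButDisconnected
    case .onButDisconnected: return .off
    case .off:               return .onAndConnected
    }
}
```

The point of the third state is that the UI would stop lying: a dimmed-but-lit toggle could mean “on but disconnected,” distinct from fully off.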


Apple execs respond to Siri and privacy

“I think it is a false narrative,” said Greg Joswiak, Apple’s VP of product marketing. “It’s true that we like to keep the data as optimized as possible, that’s certainly something that I think a lot of users have come to expect, and they know that we’re treating their privacy maybe different than some others are.”

Joswiak argues that Siri can be every bit as helpful as other assistants without accumulating a lot of personal user data in the cloud, as companies like Facebook and Google are accustomed to doing. “We’re able to deliver a very personalized experience . . . without treating you as a product that keeps your information and sells it to the highest bidder. That’s just not the way we operate.”


Equifax Breach Response Turns Dumpster Fire

Brian Krebs:

I cannot recall a previous data breach in which the breached company’s public outreach and response has been so haphazard and ill-conceived as the one coming right now from big-three credit bureau Equifax, which rather clumsily announced Thursday that an intrusion jeopardized Social Security numbers and other information on 143 million Americans.

Bloomberg moved a story yesterday indicating that three top executives at Equifax sold millions of dollars worth of stock during the time between when the company says it discovered the breach and when it notified the public and investors.

Shares of Equifax’s stock on the New York Stock Exchange were down more than 13 percent at time of publication versus yesterday’s price.

The executives reportedly told Bloomberg they didn’t know about the breach when they sold their shares. A law firm in New York has already announced it is investigating potential insider trading claims against Equifax.


New York Times review of Samsung’s Galaxy Note 8: “Poor Biometrics and a Subpar Assistant”

Brian X. Chen, writing for the New York Times:

There is as much to love about the new Samsung Galaxy Note 8 as there is to hate.

Let’s get the bad stuff out of the way. For unlocking the phone, the eye scanner barely works and the fingerprint sensor is in a lousy place. Samsung’s Bixby, which is included, is the most incompetent virtual assistant on the market. And — need I remind you — this phone line has a reputation for gadgets that spontaneously combust.


Some of the biometrics, including the ability to unlock your phone by scanning your face or irises, are so poorly executed that they feel like marketing gimmicks as opposed to actual security features.


The iris scanner shines infrared light in your eyes to identify you and unlock the phone. That sounds futuristic, but when you set up the feature, it is laden with disclaimers from Samsung. The caveats include: Iris scanning might not work well if you are wearing glasses or contact lenses; it might not work in direct sunlight; it might not work if there is dirt on the sensor.


When you set up the face scanner, Samsung displays another disclaimer, including a warning that your phone could be unlocked by “someone or something” that looks like you.

Brian does love the screen and the camera, and the fact that it doesn’t explode like the Note 7 (at least, there are no reports so far), but that warning…


How Apple Finally Made Siri Sound More Human

The first time Alex Acero saw Her, he watched it like a normal person. The second time, he didn’t watch the movie at all. Acero, the Apple executive in charge of the tech behind Siri, sat there with his eyes closed, listening to how Scarlett Johansson voiced her artificially intelligent character Samantha. He paid attention to how she talked to Theodore Twombly, played by Joaquin Phoenix, and how Twombly talked back. Acero was trying to discern what about Samantha could make someone fall in love without ever seeing her.

When I ask Acero what he learned about why the voice worked so well, he laughs because the answer is so obvious. “It is natural!” he says. “It was not robotic!” This hardly counts as a revelation for Acero. Mostly, it confirmed that his team at Apple has spent the last few years on the right project: making Siri sound more human.

This fall, when iOS 11 hits millions of iPhones and iPads around the world, the new software will give Siri a new voice. It doesn’t include many new features or tell better jokes, but you’ll notice the difference. Siri now takes more pauses in sentences, elongates syllables right before a pause, and its speech lilts up and down as it talks. The words sound more fluid, and Siri speaks more languages, too. It’s nicer to listen to, and to talk to.

Apple spent years re-architecting the technology behind Siri, transforming it from a virtual assistant into the catch-all term for all the artificial intelligence powering your phone. It has relentlessly expanded into new countries and languages (for all its faults, Siri’s by far the most worldly assistant on the market). And slowly at first but more quickly now, Apple has worked to make Siri available anywhere and everywhere. Siri now falls under the control of Craig Federighi, Apple’s head of software, indicating that Siri’s now as important to Apple as iOS.

It’ll still be a while before the tech’s good enough to make you fall in love with your virtual assistant. But Acero and his team think they’ve taken a giant leap forward. And they believe firmly that if they can make Siri sound less like a robot and more like someone you know and trust, they can make Siri great even when it fails. And that, in these early days of AI and voice technology, might be the best-case scenario.


Apple Axes Annual Apple Music Festival in London after 10 Years

Apple has confirmed to MBW that it will no longer be hosting the annual Apple Music Festival at London’s Roundhouse.

The UK event officially became the Apple Music Festival in 2015 as part of a rebranding away from its original name of the iTunes Festival.

The annual show was first held in 2007 – typically running for a month at a time with concerts every night, and tickets going to competition winners.

Artists who played the event over its decade-long run included Adele, Oasis, Mumford & Sons, Paul Simon, Ed Sheeran, Coldplay, Lady Gaga, Arctic Monkeys, Kendrick Lamar, Pharrell Williams, The Weeknd, One Direction and Beck.


However, the cancellation of the festival doesn’t signal a complete retreat from live events by Apple Music.

The brand was recently a partner of shows by Haim and Skepta in London and Arcade Fire in Brooklyn and it had a heavy presence at SXSW in Texas earlier this year – where it backed shows from Lana Del Rey, Vince Staples and DJ Khaled.

In addition, Apple Music also sponsored Drake’s 32-date Summer Sixteen Tour in 2016 and it supports regular live sessions from its ‘Up Next’ artists.


Siri Leadership Has Officially Moved From Eddy Cue to Craig Federighi

Joe Rossignol, writing for MacRumors:

Apple has updated its executive leadership page to acknowledge that software engineering chief Craig Federighi now officially oversees development of Siri. The responsibility previously belonged to Apple’s services chief Eddy Cue.


Apple’s leadership page is only now reflecting Federighi’s role as head of Siri, but the transition has been apparent for several months, based on recent interviews and stage appearances at Apple’s keynotes.


David Sparks on the potential of Apple’s ARKit

While I was at WWDC this year, I was talking to a friend who happens to know quite a bit about what’s going on at Apple. I was gushing about the improvements to the iPad with iOS 11. At some point he interrupted me to explain that the biggest game-changer in iOS 11 is not iOS productivity. “It’s AR,” he said. If augmented reality is new to you, it’s a technology that allows you to overlay computer-generated bits over photos and video of the real world. Imagine holding up your phone to look at a line of shops with an AR arrow drawn over the screen to show you the most efficient route you can take to find spicy carrots.

ARKit is Apple’s attempt to bring augmented reality to the masses. Historically, the problem with most AR implementations is that they required two distinct sets of skills. First, the app developer had to have a great idea about how to use AR, and second, the developer had to be a wizard at building an underlying AR engine. It’s that second part that prevented much exploration of the first part. As a result, there are very few examples of AR on iOS (and even fewer examples of good AR on iOS).

So my well-connected friend told me that we should not underestimate Apple’s ARKit. As it was explained to me, a group of very smart people spent years building the ARKit APIs that we’re now seeing with iOS 11. ARKit does all the heavy lifting for app developers who want to add an augmented reality system to their app. It effectively democratizes AR, so any developer with a good idea can tap into all that work for their AR engine with just a few lines of code.

I admitted to my friend back at WWDC that while I thought ARKit was cool, I didn’t really see why it could be such a big deal. In my head, the above spicy carrot example was the beginning and the end of how I’d use AR in my life.


When iOS 11 ships (probably only a matter of weeks from now), augmented reality is, overnight, going to transform from a fringe technology to something installed on hundreds of millions of iOS devices. I think my friend back at WWDC was right. This is going to be a big deal.

ARKit is going to usher in the newest gold rush for app developers. Once iOS 11 ships, there are going to be several developers that do something brilliant with augmented reality and their apps are going to go gangbusters. To me, however, the real interesting part will be after that initial wave, when someone comes up with a great idea for augmented reality that is completely out of the box and changes a little something for everyone. I fully expect that to happen.
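To give a sense of what “just a few lines of code” means here, this is roughly what a minimal ARKit world-tracking session looks like. ARSCNView, ARWorldTrackingConfiguration, and the session calls are Apple’s real ARKit/SceneKit API; the view-controller wiring around them is an illustrative sketch, not a complete app:

```swift
import UIKit
import ARKit

// A minimal sketch of an AR-enabled screen. ARKit does the heavy
// lifting internally: camera capture, visual-inertial motion
// tracking, and surface detection.
class ARDemoViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Configure and start world tracking with horizontal
        // plane detection (floors, tables, streets).
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```

Everything an AR engine used to require a specialist for (tracking the device’s position, finding surfaces, anchoring virtual content to the world) happens behind that `session.run` call; the developer’s job shrinks to placing content in the scene.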


Apple announces iPhone event on Sept. 12 in Steve Jobs Theater

Jim Dalrymple:

Apple on Thursday announced an event that will be held at the Steve Jobs Theater, located at the company’s new Apple Park campus. The event will take place on September 12 at 10:00 am.

While Apple didn’t specifically say in the invite what would be announced, it is widely expected that the company will unveil a new iPhone at the event. In fact, there could be a couple of new iPhones introduced at the event.

September is usually the time that Apple unveils its new iPhones for the upcoming holiday shopping season, so the timing of this event makes sense for that release.