Apple this week released updates to macOS and iOS to address the critical FaceTime security flaw, disclosed in late January, that allowed callers to secretly activate a target device's microphone. But what was concerning about the flaw was not just the security risk, but the implications it highlighted about how important vulnerabilities can be reported to Apple.
Group FaceTime calls debuted alongside iOS 12 in late 2018. The feature expanded the capabilities of Apple's popular video calling service but also introduced security problems. The flaw was as dire as it sounds. Someone could initiate a FaceTime call, activate their intended target's microphone and then listen in without that person's knowledge. Worse still was the fact that pressing the iPhone's side button to decline the call would actually turn on the front-facing camera.
Apple initially said it would release updates to macOS and iOS addressing the flaw within a few days of its public disclosure. Following public backlash, though, it disabled group FaceTime calls on the server side on January 28. It released the operating system updates on February 7.
But Apple wasn't only criticized for its initial plan of leaving a critical security vulnerability in a popular communications platform alone for several days. The company was also chided after the revelation that a teenager and his mom were forced to jump through hoops to disclose the flaw.
A quick note about reporting bugs in Apple products: sometimes you can just email customer support. Most of the time, however, you have to create a "radar." Ordinary users can't create new radars; they have to sign up for a developer account first. Then it's up to Apple to determine how the radar is handled. This runs counter to how most tech companies handle bug reports, especially those involving security vulnerabilities in their products. Most make it easy to disclose a problem and then pay the person who discovered it based on the flaw's severity; they don't require signing up for a developer program.
You can probably see where this is going: the mother-son duo who discovered this flaw attempted to inform Apple's customer support. They were told to sign up for a developer account, file a radar and then submit as many details as possible about the issue. In the meantime, the flaw would remain.
One security researcher protested Apple's decision not to pay the people who disclosed the FaceTime flaw by refusing to share information about a macOS vulnerability that could allow attackers to steal ostensibly secure passwords.
There has been some good news on that front, though, which is that Apple does intend to pay the kid who discovered the flaw (no word on how large a cut mom's gonna get). According to The Verge, it's also "providing an additional gift to fund Grant Thompson’s tuition." The company hasn't revealed how much it plans to pay.
Now the questions are whether or not this will lead to structural change at Apple and how other people who attempted to disclose the same flaw will be compensated. At least one other person has come forward to say they disclosed the flaw to Apple before it was publicized. How will that be handled?
And how can Apple expect people to report security flaws, which can have negative repercussions for its dedicated users, if getting a flaw acknowledged requires jumping through hoops and signing up for a developer program they may have no interest in?
Apple handled this flaw better than other companies might have. It was able to disable group FaceTime on the server side and bundled a more permanent solution with operating system updates released just a few days later. But the disclosure process--and the company's initial plan to simply leave the flaw alone for a few days--raise questions about whether the company could have resolved the problem better.
The operating system updates that address this flaw--and others--are available now. You can find instructions for updating your macOS or iOS device on Apple's support website.