Mummy Buzz

Dec 30, 2014

Facebook 'Year In Review' App Hits Dad Where It Hurts Most

Behind every computer is an accountable programmer

Was 2014 a bad year for you? It was a terrible year for Eric Meyer. Not only did he lose his six-year-old daughter, Rebecca, he was recently reminded of the awful reality by his very insensitive computer. 

Meyer was strangely gracious about the whole thing. He didn't blame the Facebook programmers behind the 'Year in Review' app for creating an algorithm that simply spits out the major moments from our social media timelines.


Regardless of what Hollywood tells you, artificial intelligence can't discern the emotions behind said major moments, be they heavenly or abjectly hellish.

"For those of us who lived through the death of loved ones, or spent extended time in the hospital, or were hit by divorce or losing a job or any one of a hundred crises, we might not want another look at this past year," wrote Meyer on his blog.

In other words, 2014 was a monumental year for Meyer, and one he will never forget. But not for any reason he wants to remember. So when the Facebook app depicted a party scene complete with streamers and balloons surrounding an image of his late daughter's smiling face, it ground the proverbial salt into this dad's wide-open wound.


Meyer called the gaffe “inadvertent algorithmic cruelty,” and although he received a personal apology from the app's product manager, Jonathan Gheller, 'sorry' falls painfully short of excusing the error. After all, computers don't act of their own free will.

Behind every Mac and Microsoft is an accountable programmer—and this one screwed up royally.

"Where the human aspect fell short, at least with Facebook," wrote Meyer, "was in not providing a way to opt out. The Year in Review ad keeps coming up in my feed, rotating through different fun-and-fabulous backgrounds, as if celebrating a death, and there is no obvious way to stop it." 

Before inflicting (I mean, launching) their apps on users, developers must give due consideration to 'worst-case' scenarios like Meyer's, and how to avoid them. It's the least humans can do. Letting people opt out before any emotional images appear, rather than making them view everything before saying "no thanks," may have been a better tack to take when rolling out this app, which, to be fair, has been enjoyable for many.