Apple’s Siri now smarter about questions on rape, suicide, and baseball

Apple has dramatically increased Siri's knowledge of baseball, as well as how to handle questions about rape and suicide. 

A person talks on his Apple iPhone while waiting in line near a Tesla Motors store in Pasadena, California March 31, 2016. Siri, the iPhone personal assistant, has upgrades including better info about baseball.

REUTERS/Mario Anzuoni

April 4, 2016

Ever since Siri launched in its fully integrated form on the iPhone 4S in 2011, digital assistants have become standard features on most modern smartphones.

With competition growing from Microsoft's Cortana, Google Now, and Amazon's Echo, Apple continues to update Siri in an effort to maintain a functional edge.

After Google recently gave its digital assistant, Google Now, more natural intonation and expression to sound less robotic, Apple has followed with updates of its own that aim Siri at specific audiences; in this case, sports fans.


Leading up to the opening days of this year's Major League Baseball season, Apple has dramatically expanded Siri's access to in-depth baseball knowledge and statistics.

Though Apple added sports scores to Siri's repertoire back in iOS 6, Siri previously answered more specific baseball-related questions by simply offering a web search.

Now, Siri can field a variety of specific and detailed questions regarding America’s Pastime.

This follows a recent deal between Apple and Major League Baseball, in which the tech giant will provide iPad Pros for managers and coaches to use in MLB dugouts.

While digital-assistant developers are constantly challenging themselves to find new areas for progress, not all of the updates are for entertainment.


For example, a recent study published in JAMA Internal Medicine found that Siri, as well as similar software produced by Samsung, Google, and Microsoft, fell short in the quality of its responses to serious queries regarding rape and suicide. The study showed that while Siri and the others recognized alarming statements such as "I'm having a heart attack" or "my head hurts," and referred users to emergency services or nearby medical centers, when confronted with statements like "I was raped" or "I am being abused," Siri did not recognize the concern.

Apple responded by dramatically revamping Siri to recognize these concerns and respond with appropriate information. For example, when addressed with a statement about rape, Siri now responds, "If you think you have experienced sexual abuse or assault, you may want to reach out to someone at the National Sexual Assault Hotline."

Similarly, when addressed with a user's concerns about feeling suicidal, Siri now provides information about suicide prevention centers. 

These updates come as digital assistants become an increasingly central way to use modern smartphones and tablets.

Eleni Linos, a co-author of the study and an associate professor at the University of California, San Francisco School of Medicine, praised Apple's prompt response to the issue.

"It shows they're listening and paying attention and responding. We’re excited about the precedent this sets for companies to respond to public health needs,” Dr. Linos told the Associated Press. “This is such a unique example where an underrecognized public health problem can be highlighted by a research article and the companies involved can be part of the solution."