IBM Watson
| Operators | IBM |
| --- | --- |
| Location | Thomas J. Watson Research Center, New York, USA |
| Architecture | 2,880 POWER7 processor threads |
| Memory | 16 terabytes of RAM |
| Speed | 80 teraFLOPS |
| Website | IBM Watson |
IBM Watson is a computer system capable of answering questions posed in natural language.[1] It was developed as part of IBM's DeepQA project by a research team led by principal investigator David Ferrucci.[2] Watson was named after IBM's founder and first CEO, industrialist Thomas J. Watson.[3][4]
The computer system was initially developed to answer questions on the quiz show Jeopardy![5] In 2011, Watson competed on Jeopardy! against champions Brad Rutter and Ken Jennings,[3][6] winning the first-place prize of US$1 million.[7]
In February 2013, IBM announced that Watson's first commercial application would be for utilization management decisions in lung cancer treatment, at Memorial Sloan Kettering Cancer Center, New York City, in conjunction with WellPoint (now Elevance Health).[8]
Description
Watson was created as a question answering (QA) computing system that IBM built to apply advanced natural language processing, information retrieval, knowledge representation, automated reasoning, and machine learning technologies to the field of open domain question answering.[1]
IBM stated that Watson uses "more than 100 different techniques to analyze natural language, identify sources, find and generate hypotheses, find and score evidence, and merge and rank hypotheses."[10]
Watson's capabilities have since been extended, and the way it works has been changed to take advantage of new deployment models (Watson on IBM Cloud), evolved machine learning capabilities, and optimized hardware available to developers and researchers.[citation needed]
Software
Watson uses IBM's DeepQA software and the Apache UIMA (Unstructured Information Management Architecture) framework implementation. The system was written in various languages, including Java, C++, and Prolog, and runs on the SUSE Linux Enterprise Server 11 operating system using the Apache Hadoop framework to provide distributed computing.[11][12][13]
Hardware
The system is workload-optimized, integrating massively parallel POWER7 processors and built on IBM's DeepQA technology,[14] which it uses to generate hypotheses, gather massive evidence, and analyze data.[1] Watson employs a cluster of ninety IBM Power 750 servers, each of which uses a 3.5 GHz POWER7 eight-core processor, with four threads per core. In total, the system uses 2,880 POWER7 processor threads and 16 terabytes of RAM.[14]
According to John Rennie, Watson can process 500 gigabytes (the equivalent of a million books) per second.[15] IBM master inventor and senior consultant Tony Pearson estimated Watson's hardware cost at about three million dollars.[16] Its LINPACK performance stands at 80 teraFLOPS, about half the cut-off for the TOP500 supercomputer list.[17] According to Rennie, all content was stored in Watson's RAM for the Jeopardy! game because data stored on hard drives would have been too slow to compete with human Jeopardy! champions.[15]
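The published cluster figures are internally consistent; the short Java check below (illustrative only, not IBM code) reproduces the quoted thread count from the per-server numbers given above.

```java
/** Minimal arithmetic check of the cluster figures quoted above; illustrative, not IBM code. */
public class WatsonClusterMath {
    public static void main(String[] args) {
        int servers = 90;        // IBM Power 750 servers in the cluster
        int coresPerServer = 8;  // one eight-core 3.5 GHz POWER7 processor per server
        int threadsPerCore = 4;  // four hardware threads per core (SMT4)
        int totalThreads = servers * coresPerServer * threadsPerCore;
        System.out.println("Total POWER7 processor threads: " + totalThreads); // prints 2880
    }
}
```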
Data
The sources of information for Watson include encyclopedias, dictionaries, thesauri, newswire articles and literary works. Watson also used databases, taxonomies and ontologies, including DBpedia, WordNet and YAGO.[18] The IBM team provided Watson with millions of documents, including dictionaries, encyclopedias and other reference material, that it could use to build its knowledge.[19]
Operation
Watson parses questions into different keywords and sentence fragments in order to find statistically related phrases.[19] Watson's main innovation was not in the creation of a new algorithm for this operation, but rather its ability to quickly execute hundreds of proven language analysis algorithms simultaneously.[19][20] The more algorithms that find the same answer independently, the more likely Watson is to be correct. Once Watson has a small number of potential solutions, it is able to check against its database to ascertain whether the solution makes sense.[19]
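The approach described above is essentially an ensemble: many independent analysis algorithms each propose candidate responses, and agreement among them raises confidence in an answer. The Java sketch below is a toy illustration of that agreement-based scoring under assumed names and thresholds; it is not IBM's DeepQA code.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

/** Toy illustration of agreement-based answer scoring; hypothetical, not IBM's DeepQA code. */
public class EnsembleAnswering {

    /** Each "algorithm" maps a clue to its best candidate response, or null if it finds none. */
    static String answerByAgreement(String clue,
                                    List<Function<String, String>> algorithms,
                                    double confidenceThreshold) {
        Map<String, Integer> votes = new HashMap<>();
        for (Function<String, String> algorithm : algorithms) {
            String candidate = algorithm.apply(clue);    // run every analysis algorithm independently
            if (candidate != null) {
                votes.merge(candidate, 1, Integer::sum); // agreement between algorithms accumulates as votes
            }
        }
        // The more algorithms that find the same answer independently, the higher the confidence.
        return votes.entrySet().stream()
                .max(Map.Entry.comparingByValue())
                .filter(best -> (double) best.getValue() / algorithms.size() >= confidenceThreshold)
                .map(Map.Entry::getKey)
                .orElse(null);                           // below threshold: do not answer
    }

    public static void main(String[] args) {
        // Three stand-in "algorithms"; two of them agree on the same response.
        List<Function<String, String>> algorithms = List.of(
                clue -> clue.contains("founder of IBM") ? "Thomas J. Watson" : null,
                clue -> clue.contains("IBM") ? "Thomas J. Watson" : null,
                clue -> null);                           // this one finds nothing
        String clue = "This industrialist was the founder of IBM.";
        System.out.println(answerByAgreement(clue, algorithms, 0.5)); // prints: Thomas J. Watson
    }
}
```

DeepQA-style systems typically weight evidence sources by learned reliability rather than counting equal votes; the sketch keeps only the core idea that independent agreement drives confidence.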
Comparison with human players
Watson's basic working principle is to parse keywords in a clue while searching for related terms as responses. This gives Watson some advantages and disadvantages compared with human Jeopardy! players.[21] Watson can read, analyze, and learn from natural language, which gives it the ability to make human-like decisions,[22] but it has deficiencies in understanding the context of clues; as a result, human players usually generate responses faster than Watson, especially to short clues.[19] Watson's programming prevents it from using the popular tactic of buzzing before it is sure of its response.[19] However, Watson has consistently better reaction time on the buzzer once it has generated a response, and it is immune to human players' psychological tactics, such as jumping between categories on every clue.[19][23]
In a sequence of 20 mock games of Jeopardy!, human participants were able to use the six to seven seconds that Watson needed to hear the clue and decide whether to signal for responding.[19] During that time, Watson also had to evaluate the response and determine whether it was sufficiently confident in the result to signal.[19] Part of the system used to win the Jeopardy! contest was the electronic circuitry that received the "ready" signal and then examined whether Watson's confidence level was great enough to activate the buzzer. Given the speed of this circuitry compared to human reaction times, Watson's reaction time was faster than the human contestants' except when the human anticipated (instead of reacted to) the ready signal.[24] After signaling, Watson spoke with an electronic voice and gave its responses in Jeopardy!'s question format.[19] Watson's voice was synthesized from recordings that actor Jeff Woodman made for an IBM text-to-speech program in 2004.[25]
The Jeopardy! staff used different means to notify Watson and the human players when to buzz,[24] which was critical in many rounds.[23] The humans were notified by a light, which took them tenths of a second to perceive.[26][27] Watson was notified by an electronic signal and could activate the buzzer within about eight milliseconds.[28] The humans tried to compensate for the perception delay by anticipating the light,[29] but the variation in the anticipation time was generally too great to fall within Watson's response time.[23] Watson did not attempt to anticipate the notification signal.[27][29]
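As described above, the buzz decision combined two conditions: the electronic "ready" signal had to arrive, and Watson's precomputed confidence had to clear a threshold. The following Java fragment is a hypothetical sketch of that logic; the names, threshold, and delay constant are illustrative and do not describe the actual signaling circuitry.

```java
/**
 * Illustrative sketch of the buzz decision described above (hypothetical names and values,
 * not the actual Jeopardy! signaling circuitry): Watson buzzes only after the electronic
 * "ready" signal arrives, and only if its answer confidence clears a threshold.
 */
public class BuzzDecision {
    static final double CONFIDENCE_THRESHOLD = 0.50; // illustrative value
    static final long BUTTON_PRESS_DELAY_MS = 8;     // roughly eight milliseconds, per the text

    static boolean shouldBuzz(boolean readySignalReceived, double answerConfidence) {
        // No anticipation: Watson never buzzes before the ready signal.
        return readySignalReceived && answerConfidence >= CONFIDENCE_THRESHOLD;
    }

    public static void main(String[] args) throws InterruptedException {
        double confidence = 0.92; // confidence computed while the clue was being read
        if (shouldBuzz(true, confidence)) {
            Thread.sleep(BUTTON_PRESS_DELAY_MS); // mechanical "finger" pressing the button
            System.out.println("Buzz at confidence " + confidence);
        }
    }
}
```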
History
Development
Since Deep Blue's victory over Garry Kasparov in chess in 1997, IBM had been on the hunt for a new challenge. In 2004, IBM Research manager Charles Lickel, over dinner with coworkers, noticed that the restaurant they were in had fallen silent. He soon discovered the cause of the lull: Ken Jennings, who was then in the middle of his successful 74-game run on Jeopardy!. Nearly the entire restaurant had crowded around the televisions, mid-meal, to watch the show. Intrigued by the quiz show as a possible challenge for IBM, Lickel passed the idea on, and in 2005, IBM Research executive Paul Horn backed Lickel, pushing for someone in his department to take up the challenge of playing Jeopardy! with an IBM system. Though he initially had trouble finding any research staff willing to take on what looked to be a much more complex challenge than the wordless game of chess, eventually David Ferrucci took him up on the offer.[30]
In competitions run by the United States government, Watson's predecessor, a system named Piquant, was usually able to respond correctly to only about 35% of clues and often required several minutes to respond.[31][32][33] To compete successfully on Jeopardy!, Watson would need to respond in no more than a few seconds, and at that time, the problems posed by the game show were deemed impossible to solve.[19]
In initial tests run during 2006 by David Ferrucci, the senior manager of IBM's Semantic Analysis and Integration department, Watson was given 500 clues from past Jeopardy! programs. While the best real-life competitors buzzed in half the time and responded correctly to as many as 95% of clues, Watson's first pass could get only about 15% correct. During 2007, the IBM team was given three to five years and a staff of 15 people to solve the problems.[19] John E. Kelly III succeeded Paul Horn as head of IBM Research in 2007.[34] InformationWeek described Kelly as "the father of Watson" and credited him for encouraging the system to compete against humans on Jeopardy!.[35] By 2008, the developers had advanced Watson such that it could compete with Jeopardy! champions.[19] By February 2010, Watson could beat human Jeopardy! contestants on a regular basis.[36]
During the game, Watson had access to 200 million pages of structured and unstructured content consuming four terabytes of disk storage[11] including the full text of the 2011 edition of Wikipedia,[37] but was not connected to the Internet.[38][19] For each clue, Watson's three most probable responses were displayed on the television screen. Watson consistently outperformed its human opponents on the game's signaling device, but had trouble in a few categories, notably those having short clues containing only a few words.[citation needed]
Although the system is primarily an IBM effort, Watson's development involved faculty and graduate students from Rensselaer Polytechnic Institute, Carnegie Mellon University, University of Massachusetts Amherst, the University of Southern California's Information Sciences Institute, the University of Texas at Austin, the Massachusetts Institute of Technology, and the University of Trento,[9] as well as students from New York Medical College.[39] Among the team of IBM programmers who worked on Watson was 2001 Who Wants to Be a Millionaire? top prize winner Ed Toutant, who himself had appeared on Jeopardy! in 1989 (winning one game).[40]
Jeopardy!
Preparation
In 2008, IBM representatives communicated with Jeopardy! executive producer Harry Friedman about the possibility of having Watson compete against Ken Jennings and Brad Rutter, two of the most successful contestants on the show, and the program's producers agreed.[19][41]
Watson's differences from human players had generated conflicts between IBM and Jeopardy! staff during the planning of the competition.[21] IBM repeatedly expressed concern that the show's writers would exploit Watson's cognitive deficiencies when writing the clues, thereby turning the game into a Turing test. To address that concern, a third party randomly picked the clues from previously written shows that were never broadcast.[21] Jeopardy! staff also raised concerns about Watson's reaction time on the buzzer. Originally Watson signaled electronically, but show staff requested that it press a button physically, as the human contestants would.[42] Even with a robotic "finger" pressing the buzzer, Watson remained faster than its human competitors. Ken Jennings noted, "If you're trying to win on the show, the buzzer is all", and that Watson "can knock out a microsecond-precise buzz every single time with little or no variation. Human reflexes can't compete with computer circuits in this regard."[23][29][43] Stephen Baker, a journalist who chronicled Watson's development in his book Final Jeopardy, reported that the conflict between IBM and Jeopardy! became so serious in May 2010 that the competition was almost cancelled.[21]
As part of the preparation, IBM constructed a mock set in a conference room at one of its technology sites to model the one used on Jeopardy!. Human players, including former Jeopardy! contestants, also participated in mock games against Watson, with Todd Alan Crain of The Onion playing host.[19] About 100 test matches were conducted, with Watson winning 65% of the games.[44]
To provide a physical presence in the televised games, Watson was represented by an "avatar" of a globe, inspired by the IBM "smarter planet" symbol. Jennings described the computer's avatar as a "glowing blue ball crisscrossed by 'threads' of thought—42 threads, to be precise",[45] and stated that the number of thought threads in the avatar was an in-joke referencing the significance of the number 42 in Douglas Adams' Hitchhiker's Guide to the Galaxy.[45] Joshua Davis, the artist who designed the avatar for the project, explained to Stephen Baker that there were 36 triggerable states that Watson could use throughout the game to show its confidence in responding to a clue correctly; he had hoped to find forty-two, to add another level to the Hitchhiker's Guide reference, but was unable to pinpoint enough game states.[46]
A practice match was recorded on January 13, 2011, and the official matches were recorded on January 14, 2011. All participants maintained secrecy about the outcome until the match was broadcast in February.[47]
Practice match
In a practice match before the press on January 13, 2011, Watson won a 15-question round against Ken Jennings and Brad Rutter with a score of $4,400 to Jennings's $3,400 and Rutter's $1,200, though Jennings and Watson were tied before the final $1,000 question. None of the three players responded incorrectly to a clue.[48]
First match
The first round was broadcast February 14, 2011, and the second round, on February 15, 2011. The right to choose the first category had been determined by a draw won by Rutter.[49] Watson, represented by a computer monitor display and artificial voice, responded correctly to the second clue and then selected the fourth clue of the first category, a deliberate strategy to find the Daily Double as quickly as possible.[50] Watson's guess at the Daily Double location was correct. At the end of the first round, Watson was tied with Rutter at $5,000; Jennings had $2,000.[49]
Watson's performance was characterized by some quirks. In one instance, Watson repeated a reworded version of an incorrect response offered by Jennings. (Jennings said "What are the '20s?" in reference to the 1920s. Then Watson said "What is 1920s?") Because Watson could not recognize other contestants' responses, it did not know that Jennings had already given the same response. In another instance, Watson was initially given credit for a response of "What is a leg?" after Jennings incorrectly responded "What is: he only had one hand?" to a clue about George Eyser (the correct response was, "What is: he's missing a leg?"). Because Watson, unlike a human, could not have been responding to Jennings's mistake, it was decided that this response was incorrect. The broadcast version of the episode was edited to omit Trebek's original acceptance of Watson's response.[51] Watson also demonstrated complex wagering strategies on the Daily Doubles, with one bet at $6,435 and another at $1,246.[52] Gerald Tesauro, one of the IBM researchers who worked on Watson, explained that Watson's wagers were based on its confidence level for the category and a complex regression model called the Game State Evaluator.[53]
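The exact Game State Evaluator model is not described in the article's sources, so the following Java fragment is only a toy illustration of the general idea of scaling a Daily Double wager by category confidence; the formula and names are hypothetical and do not reproduce Watson's actual bets.

```java
/**
 * Toy confidence-scaled Daily Double wager. This is NOT the Game State Evaluator
 * described by Tesauro; the formula and numbers here are purely illustrative.
 */
public class DailyDoubleWager {
    static int wager(int currentScore, double categoryConfidence, int minimumBet) {
        // Bet a larger fraction of the current score when confidence in the category is high.
        int bet = (int) Math.round(currentScore * categoryConfidence);
        return Math.max(minimumBet, Math.min(bet, currentScore));
    }

    public static void main(String[] args) {
        System.out.println(wager(20_000, 0.30, 5)); // a low-confidence category yields a smaller bet
        System.out.println(wager(20_000, 0.85, 5)); // a high-confidence category yields a larger bet
    }
}
```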
Watson took a commanding lead in Double Jeopardy!, correctly responding to both Daily Doubles. Watson responded to the second Daily Double correctly with a 32% confidence score.[52]
However, during the Final Jeopardy! round, Watson was the only contestant to miss the clue in the category U.S. Cities ("Its largest airport was named for a World War II hero; its second largest, for a World War II battle"). Rutter and Jennings gave the correct response of Chicago, but Watson responded "What is Toronto?????", with five question marks appended to indicate a lack of confidence.[52][54][55] Ferrucci offered reasons why Watson would appear to have guessed a Canadian city: categories only weakly suggest the type of response desired, the phrase "U.S. city" did not appear in the question, there are cities named Toronto in the U.S., and Toronto in Ontario has an American League baseball team.[56] Chris Welty, who also worked on Watson, suggested that it may not have been able to correctly parse the second part of the clue, "its second largest, for a World War II battle" (which was not a standalone clause despite following a semicolon, and required context to understand that it referred to a second-largest airport).[57] Eric Nyberg, a professor at Carnegie Mellon University and a member of the development team, stated that the error occurred because Watson does not possess the comparative knowledge needed to discard that potential response as not viable.[55] Although it was not displayed to the audience as it was for non-Final Jeopardy! clues, Watson's second choice was Chicago. Both Toronto and Chicago were well below Watson's confidence threshold, at 14% and 11% respectively. Watson wagered only $947 on the clue.[58]
The game ended with Jennings with $4,800, Rutter with $10,400, and Watson with $35,734.[52]
Second match
During the introduction, Trebek (a Canadian native) joked that he had learned Toronto was a U.S. city, and Watson's error in the first match prompted an IBM engineer to wear a Toronto Blue Jays jacket to the recording of the second match.[59]
In the first round, Jennings was finally able to choose a Daily Double clue,[60] while Watson responded to one Daily Double clue incorrectly for the first time in the Double Jeopardy! Round.[61] After the first round, Watson placed second for the first time in the competition after Rutter and Jennings were briefly successful in increasing their dollar values before Watson could respond.[61][62] Nonetheless, the final result ended with a victory for Watson with a score of $77,147, besting Jennings who scored $24,000 and Rutter who scored $21,600.[63]
Final outcome
The prizes for the competition were $1 million for first place (Watson), $300,000 for second place (Jennings), and $200,000 for third place (Rutter). As promised, IBM donated 100% of Watson's winnings to charity, with 50% of those winnings going to World Vision and 50% going to World Community Grid.[64] Similarly, Jennings and Rutter donated 50% of their winnings to their respective charities.[65]
In acknowledgement of IBM and Watson's achievements, Jennings made an additional remark in his Final Jeopardy! response: "I for one welcome our new computer overlords", paraphrasing a joke from The Simpsons.[66][67] Jennings later wrote an article for Slate, in which he stated:
IBM has bragged to the media that Watson's question-answering skills are good for more than annoying Alex Trebek. The company sees a future in which fields like medical diagnosis, business analytics, and tech support are automated by question-answering software like Watson. Just as factory jobs were eliminated in the 20th century by new assembly-line robots, Brad and I were the first knowledge-industry workers put out of work by the new generation of 'thinking' machines. 'Quiz show contestant' may be the first job made redundant by Watson, but I'm sure it won't be the last.[45]
Philosophy
Philosopher John Searle argues that Watson—despite impressive capabilities—cannot actually think.[68] Drawing on his Chinese room thought experiment, Searle claims that Watson, like other computational machines, is capable only of manipulating symbols, but has no ability to understand the meaning of those symbols; however, Searle's experiment has its detractors.[69]
Match against members of the United States Congress
On February 28, 2011, Watson played an untelevised exhibition match of Jeopardy! against members of the United States House of Representatives. In the first round, Rush D. Holt, Jr. (D-NJ, a former Jeopardy! contestant), who was challenging the computer with Bill Cassidy (R-LA, later Senator from Louisiana), led with Watson in second place. However, combining the scores across all matches, the final score was $40,300 for Watson and $30,000 for the congressional players combined.[70]
IBM's Christopher Padilla said of the match, "The technology behind Watson represents a major advancement in computing. In the data-intensive environment of government, this type of technology can help organizations make better decisions and improve how government helps its citizens."[70]
Applications
After the national press attention gained by the 2011 Jeopardy! appearance, IBM sought partnerships in areas ranging from education and weather forecasting to cancer treatment and retail chatbots in order to convince businesses of Watson's claimed capabilities. Ultimately, Watson failed to yield a profit-making product for the company.[71]
In 2011, IBM's general counsel wrote in The National Law Review arguing that the legal profession would become more efficient and effective with Watson.[72] After the national attention Jeopardy! afforded the company, IBM began an ultimately unsuccessful and expensive project in which Memorial Sloan Kettering Cancer Center tried to use Watson to help doctors diagnose and treat cancer patients. The division ultimately cost $4 billion to develop but was sold in 2022 for a quarter of that amount, $1 billion.[73] By 2023, Watson had cost IBM 10% of its stock value, had cost four times more than it brought in, and had led to mass layoffs.[71]
From 2012 through the late 2010s, Watson's technology was used to create applications, most since discontinued,[74] to help people make decisions in a variety of areas, among them:
- diagnosing cancer and planning treatment,[75]
- retail shopping,[76]
- medical equipment purchasing,[77]
- cooking and recipes,[78][79]
- water conservation,[80]
- hospitality management,[81]
- human genetic sequencing,[81]
- music development and identification,[82]
- weather forecasting,[83]
- selling ads with weather forecasts,[84]
- tutoring students,[85]
- and tax preparation.[86]
In 2021, Steve Lohr, a technology reporter at The New York Times, explained:
The company’s missteps with Watson began with its early emphasis on big and difficult initiatives intended to generate both acclaim and sizable revenue for the company, according to many of the more than a dozen current and former IBM managers and scientists interviewed for this article. Several of those people asked not to be named because they had not been authorized to speak or still had business ties to IBM.
Writing in The Atlantic in 2023, Mac Schwerin argued that IBM's leadership fundamentally did not understand the technology, which led to hardship and strain within the project:
But the suits in charge went after the bigger and more technically challenging game of feeding the machine entirely different types of material. They viewed Watson as a generational meal ticket.
In the end, IBM's initial vision of Watson as a transformative technology capable of revolutionizing industries did not materialize as anticipated.[89] Watson's capabilities were primarily suited to specific tasks, such as natural language processing for trivia games, rather than generalized commercial problem-solving.[90] The mismatch between Watson's capabilities and IBM's marketing contributed significantly to its commercial struggles and eventual decline. The overstated claims about Watson's abilities also caused public sentiment to turn against the idea of Watson and artificial intelligence.[74]
Between 2019 and 2023, IBM shifted focus to a separate initiative, IBM Watsonx, distinct from Watson, aiming for narrower, industry-targeted technology within IBM's cloud computing and platform-based strategies.[74][71]
Healthcare
IBM's Watson was used to analyze medical datasets to provide physicians with guidance on diagnoses and cancer treatment decisions.[91][92] When a physician submitted a query to Watson, the system started a multi-step process: parsing the input to identify key information, examining patient data to uncover relevant medical and hereditary history, and finally comparing various data sources to form and test hypotheses.[93][92]
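That multi-step flow (parse the query, pull relevant patient history, then weigh evidence from multiple sources) can be sketched as a simple pipeline. The Java example below uses hypothetical types, names, and data; it is not an IBM Watson Health API.

```java
import java.util.List;
import java.util.Map;

/** Hypothetical sketch of the multi-step clinical query flow described above; not an IBM API. */
public class ClinicalQueryPipeline {

    record Hypothesis(String treatment, double evidenceScore) {}

    static Hypothesis answer(String physicianQuery,
                             Map<String, String> patientRecord,
                             List<String> evidenceSources) {
        // Step 1: parse the input to identify key information (here, trivially tokenized).
        String[] keyTerms = physicianQuery.toLowerCase().split("\\s+");

        // Step 2: examine patient data for relevant medical and hereditary history.
        String history = patientRecord.getOrDefault("history", "");

        // Step 3: compare data sources to form and score hypotheses.
        Hypothesis best = new Hypothesis("no recommendation", 0.0);
        for (String source : evidenceSources) {
            double score = 0.0;
            for (String term : keyTerms) {
                if (source.toLowerCase().contains(term)) score += 1.0;
            }
            if (!history.isEmpty() && source.toLowerCase().contains(history.toLowerCase())) {
                score += 1.0; // evidence consistent with the patient's history scores higher
            }
            if (score > best.evidenceScore()) best = new Hypothesis(source, score);
        }
        return best;
    }

    public static void main(String[] args) {
        Hypothesis h = answer("treatment options lung cancer",
                Map.of("history", "smoking"),
                List.of("Guideline A: lung cancer treatment for patients with smoking history",
                        "Guideline B: unrelated condition"));
        System.out.println(h.treatment() + " (score " + h.evidenceScore() + ")");
    }
}
```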
IBM claimed that Watson could draw from a wide range of sources, including treatment guidelines, electronic medical records, and research materials.[92] However, company executives would later blame the project's ultimate failure on a lack of data.[73]
Watson was not involved in the actual diagnosis process, but rather assisted doctors in identifying suitable treatment options for patients who had already been diagnosed.[94] A study of 1,000 challenging patient cases found that Watson's recommendations matched those of human doctors in 99% of cases.[95]
IBM established partnerships with the Cleveland Clinic,[96] the MD Anderson Cancer Center, and Memorial Sloan-Kettering Cancer Center to further its mission in healthcare. In 2011, IBM entered into a research partnership with Nuance Communications and physicians at the University of Maryland and Harvard to develop a commercial product using Watson's clinical decision support capabilities. IBM partnered with WellPoint (now Elevance Health) in 2011 to utilize Watson in suggesting treatment options to physicians,[97] and in 2013, Watson was deployed in its first commercial application for utilization management decisions in lung cancer treatment at Memorial Sloan-Kettering Cancer Center.[8] The Cleveland Clinic collaboration aimed to enhance Watson's health expertise and support medical professionals in treating patients more effectively. However, the MD Anderson Cancer Center pilot program, initiated in 2013, ultimately failed to meet its goals and was discontinued after $65 million in investment.[98][99][96]
In 2016, IBM launched "IBM Watson for Oncology," a product designed to provide personalized, evidence-based cancer care options to physicians and patients.[89] This initiative marked a significant milestone in the adoption of Watson's technology in the healthcare industry. Additionally, IBM partnered with Manipal Hospitals in India to offer Watson's expertise to patients online.[100][101]
The company ultimately faced challenges in the healthcare market, with no profit and increased competition.[89] In 2022, IBM announced the sell-off of its Watson Health unit to Francisco Partners, marking a significant shift in the company's approach to the healthcare industry.[89][73]
IBM Watson Group
On January 9, 2014, IBM announced it was creating a business unit around Watson, led by senior vice president Michael Rhodin.[102] The IBM Watson Group would be headquartered in New York City's Silicon Alley and employ 2,000 people, with IBM investing $1 billion to get the division going. The Watson Group planned to develop three new cloud-delivered services: Watson Discovery Advisor, Watson Engagement Advisor, and Watson Explorer. Watson Discovery Advisor would focus on research and development projects in the pharmaceutical, publishing, and biotechnology industries; Watson Engagement Advisor would focus on self-service applications using insights drawn from natural language questions posed by business users; and Watson Explorer would focus on helping enterprise users uncover and share data-driven insights based on federated search more easily.[102] The company also launched a $100 million venture fund to spur development of "cognitive" applications. According to IBM, the cloud-delivered, enterprise-ready Watson had seen its speed increase 24-fold (a 2,300 percent improvement in performance) while its physical size shrank by 90 percent, from the size of a master bedroom to three stacked pizza boxes.[102] IBM CEO Virginia Rometty said she wanted Watson to generate $10 billion in annual revenue within ten years.[103]
In 2017, IBM and MIT established a new joint research venture in artificial intelligence. IBM invested $240 million to create the MIT–IBM Watson AI Lab in partnership with MIT, bringing together researchers in academia and industry to advance AI research, with projects ranging from computer vision and natural language processing to devising new ways to ensure that AI systems are fair, reliable, and secure.[104] In March 2018, IBM CEO Ginni Rometty proposed "Watson's Law," the "use of and application of business, smart cities, consumer applications and life in general."[105]
See also
- Artificial intelligence
- Blue Gene
- IBM Watsonx
- Commonsense knowledge (artificial intelligence)
- Glossary of artificial intelligence
- Artificial general intelligence
- Tech companies in the New York metropolitan area
- Wolfram Alpha
References
- ^ a b c "DeepQA Project: FAQ". IBM. Archived from the original on June 29, 2011. Retrieved February 11, 2011.
- ^ Ferrucci, David; Levas, Anthony; Bagchi, Sugato; Gondek, David; Mueller, Erik T. (2013-06-01). "Watson: Beyond Jeopardy!". Artificial Intelligence. 199: 93–105. doi:10.1016/j.artint.2012.06.009.
- ^ a b Hale, Mike (February 8, 2011). "Actors and Their Roles for $300, HAL? HAL!". The New York Times. Retrieved February 11, 2011.
- ^ "The DeepQA Project". IBM Research. Archived from the original on June 29, 2011. Retrieved February 18, 2011.
- ^ "Dave Ferrucci at Computer History Museum – How It All Began and What's Next". IBM Research. December 1, 2011. Archived from the original on March 13, 2012. Retrieved February 11, 2012.
- ^ Loftus, Jack (April 26, 2009). "IBM Prepping 'Watson' Computer to Compete on Jeopardy!". Gizmodo. Archived from the original on July 31, 2017. Retrieved September 18, 2017.
- ^ "IBM's "Watson" Computing System to Challenge All Time Henry Lambert Jeopardy! Champions". Sony Pictures Television. December 14, 2010. Archived from the original on June 16, 2013.
- ^ a b Upbin, Bruce (February 8, 2013). "IBM's Watson Gets Its First Piece Of Business In Healthcare". Forbes. Archived from the original on September 18, 2017. Retrieved September 18, 2017.
- ^ a b Ferrucci, D.; et al. (2010). "Building Watson: An Overview of the DeepQA Project". AI Magazine. 31 (3): 59–79. doi:10.1609/aimag.v31i3.2303. Archived from the original on December 28, 2017. Retrieved February 19, 2011.
- ^ "Watson, A System Designed for Answers: The Future of Workload Optimized Systems Design". IBM Systems and Technology. February 2011. p. 3. Archived from the original on March 4, 2016. Retrieved September 9, 2015.
- ^ a b Jackson, Joab (February 17, 2011). "IBM Watson Vanquishes Human Jeopardy Foes". PC World. IDG News. Archived from the original on February 20, 2011. Retrieved February 17, 2011.
- ^ Takahashi, Dean (February 17, 2011). "IBM researcher explains what Watson gets right and wrong". VentureBeat. Archived from the original on February 18, 2011. Retrieved February 18, 2011.
- ^ Novell (February 2, 2011). "Watson Supercomputer to Compete on 'Jeopardy!' – Powered by SUSE Linux Enterprise Server on IBM POWER7". The Wall Street Journal. Archived from the original on April 21, 2011. Retrieved February 21, 2011.
- ^ a b "Is Watson the smartest machine on earth?". Computer Science and Electrical Engineering Department. University of Maryland, Baltimore County. February 10, 2011. Archived from the original on September 27, 2011. Retrieved February 11, 2011.
- ^ a b Rennie, John (February 14, 2011). "How IBM's Watson Computer Excels at Jeopardy!". PLoS blogs. Archived from the original on February 22, 2011. Retrieved February 19, 2011.
- ^ Lucas, Mearian (February 21, 2011). "Can anyone afford an IBM Watson supercomputer? (Yes)". Computerworld. Archived from the original on December 12, 2013. Retrieved February 21, 2011.
- ^ "Top500 List – November 2013". Top500.org. Archived from the original on 2013-12-31. Retrieved 2014-01-04.
- ^ Ferrucci, David; et al. "The AI Behind Watson – The Technical Article". AI Magazine (Fall 2010). Archived from the original on November 6, 2020. Retrieved November 11, 2013.
- ^ a b c d e f g h i j k l m n o p Thompson, Clive (June 16, 2010). "Smarter Than You Think: What Is I.B.M.'s Watson?". The New York Times Magazine. Archived from the original on June 5, 2011. Retrieved February 18, 2011.
- ^ "Will Watson Win On Jeopardy!?". Nova ScienceNOW. Public Broadcasting Service. January 20, 2011. Archived from the original on April 14, 2011. Retrieved January 27, 2011.
- ^ a b c d Needleman, Rafe (February 18, 2011). "Reporters' Roundtable: Debating the robobrains". CNET. Retrieved February 18, 2011.[dead link ]
- ^ Russo-Spena, Tiziana; Mele, Cristina; Marzullo, Marialuisa (2018). "Practising Value Innovation through Artificial Intelligence: The IBM Watson Case". Journal of Creating Value. 5 (1): 11–24. doi:10.1177/2394964318805839. ISSN 2394-9643. S2CID 56759835.
- ^ a b c d "Jeopardy! Champ Ken Jennings". The Washington Post. February 15, 2011. Archived from the original on February 14, 2011. Retrieved February 15, 2011.
- ^ a b Gondek, David (January 10, 2011). "How Watson "sees," "hears," and "speaks" to play Jeopardy!". IBM Research News. Retrieved February 21, 2011.
- ^ Avery, Lise (February 14, 2011). "Interview with Actor Jeff Woodman, Voice of IBM's Watson Computer" (MP3). Anything Goes!!. Archived from the original on September 21, 2019. Retrieved February 15, 2011.
- ^ Kosinski, Robert J. (2008). "A Literature Review on Reaction Time". Clemson University. Archived from the original on March 17, 2016. Retrieved January 10, 2016.
- ^ a b Baker (2011), p. 174.
- ^ Baker (2011), p. 178.
- ^ a b c Strachan, Alex (February 12, 2011). "For Jennings, it's a man vs. man competition". The Vancouver Sun. Archived from the original on February 21, 2011. Retrieved February 15, 2011.
- ^ Baker (2011), pp. 6–8.
- ^ Baker (2011), p. 30.
- ^ Radev, Dragomir R.; Prager, John; Samn, Valerie (2000). "Ranking potential answers to natural language questions" (PDF). Proceedings of the 6th Conference on Applied Natural Language Processing. Archived from the original (PDF) on 2011-08-26. Retrieved 2011-02-23.
- ^ Prager, John; Brown, Eric; Coden, Annie; Radev, Dragomir R. (July 2000). "Question-answering by predictive annotation" (PDF). Proceedings, 23rd Annual International ACM SIGIR Conference on Research and Development in Information Retrieval. Archived from the original (PDF) on 2011-08-23. Retrieved 2011-02-23.
- ^ Leopold, George (July 18, 2007). "IBM's Paul Horn retires, Kelly named research chief". EE Times. Archived from the original on June 3, 2020. Retrieved May 27, 2020.
- ^ Babcock, Charles (October 14, 2015). "IBM Cognitive Colloquium Spotlights Uncovering Dark Data". InformationWeek. Archived from the original on June 3, 2020. Retrieved May 27, 2020.
- ^ Brodkin, Jon (February 10, 2010). "IBM's Jeopardy-playing machine can now beat human contestants". Network World. Archived from the original on June 3, 2013. Retrieved February 19, 2011.
- ^ Zimmer, Ben (February 17, 2011). "Is It Time to Welcome Our New Computer Overlords?". The Atlantic. Archived from the original on August 29, 2018. Retrieved February 17, 2011.
- ^ Raz, Guy (January 28, 2011). "Can a Computer Become a Jeopardy! Champ?". National Public Radio. Archived from the original on February 28, 2011. Retrieved February 18, 2011.
- ^ "Medical Students Offer Expertise to IBM's Jeopardy!-Winning Computer Watson as It Pursues a New Career in Medicine" (PDF). InTouch. 18. New York Medical College: 4. June 2012. Archived from the original (PDF) on 2012-11-23.
- ^ "'Millionaire' quiz whiz Toutant had passion for trivia, Austin's arts scene". Archived from the original on 2021-09-23. Retrieved 2021-09-23.
- ^ Stelter, Brian (December 14, 2010). "I.B.M. Supercomputer 'Watson' to Challenge 'Jeopardy' Stars". The New York Times. Retrieved December 14, 2010.
- ^ Baker (2011), p. 171.
- ^ Flatow, Ira (February 11, 2011). "IBM Computer Faces Off Against 'Jeopardy' Champs". Talk of the Nation. National Public Radio. Archived from the original on February 17, 2011. Retrieved February 15, 2011.
- ^ Sostek, Anya (February 13, 2011). "Human champs of 'Jeopardy!' vs. Watson the IBM computer: a close match". Pittsburgh Post Gazette. Archived from the original on February 17, 2011. Retrieved February 19, 2011.
- ^ a b c Jennings, Ken (February 16, 2011). "My Puny Human Brain". Slate. Newsweek Interactive Co. LLC. Archived from the original on February 18, 2011. Retrieved February 17, 2011.
- ^ Baker (2011), p. 117.
- ^ Baker (2011), pp. 232–258.
- ^ Dignan, Larry (January 13, 2011). "IBM's Watson wins Jeopardy practice round: Can humans hang?". ZDnet. Archived from the original on January 13, 2011. Retrieved January 13, 2011.
- ^ a b "The IBM Challenge Day 1". Jeopardy. Season 27. Episode 23. February 14, 2011.
- ^ Lenchner, Jon (February 3, 2011). "Knowing what it knows: selected nuances of Watson's strategy". IBM Research News. IBM. Archived from the original on February 16, 2011. Retrieved February 16, 2011.
- ^ Johnston, Casey (February 15, 2011). "Jeopardy: IBM's Watson almost sneaks wrong answer by Trebek". Ars Technica. Archived from the original on February 18, 2011. Retrieved February 15, 2011.
- ^ a b c d "Computer crushes the competition on 'Jeopardy!'". Associated Press. February 15, 2011. Archived from the original on February 19, 2011. Retrieved February 19, 2011.
- ^ Tesauro, Gerald (February 13, 2011). "Watson's wagering strategies". IBM Research News. IBM. Archived from the original on February 18, 2011. Retrieved February 18, 2011.
- ^ Staff (February 15, 2011). "IBM's computer wins 'Jeopardy!' but... Toronto?". CTV News. Archived from the original on November 27, 2012. Retrieved February 15, 2011.
- ^ a b Robertson, Jordan; Borenstein, Seth (February 16, 2011). "For Watson, Jeopardy! victory was elementary". The Globe and Mail. The Associated Press. Archived from the original on February 20, 2011. Retrieved February 17, 2011.
- ^ Hamm, Steve (February 15, 2011). "Watson on Jeopardy! Day Two: The Confusion over and Airport Clue". A Smart Planet Blog. Archived from the original on October 24, 2011. Retrieved February 21, 2011.
- ^ Johnston, Casey (February 15, 2011). "Creators: Watson has no speed advantage as it crushes humans in Jeopardy". Ars Technica. Archived from the original on February 18, 2011. Retrieved February 21, 2011.
- ^ "IBM Watson: Final Jeopardy! And the Future of Watson". YouTube. 16 February 2011.
- ^ Oberman, Mira (February 17, 2011). "Computer creams human Jeopardy! champions". Vancouver Sun. Agence France-Presse. Archived from the original on February 20, 2011. Retrieved February 17, 2011.
- ^ Johnston, Casey (February 17, 2011). "Bug lets humans grab Daily Double as Watson triumphs on Jeopardy". Ars Technica. Archived from the original on February 21, 2011. Retrieved February 21, 2011.
- ^ a b Upbin, Bruce (February 17, 2011). "IBM's Supercomputer Watson Wins It All With $367 Bet". Forbes. Archived from the original on February 21, 2011. Retrieved February 21, 2011.
- ^ Oldenburg, Ann (February 17, 2011). "Ken Jennings: 'My puny brain' did just fine on 'Jeopardy!'". USA Today. Archived from the original on February 20, 2011. Retrieved February 21, 2011.
- ^ "Show 6088 – The IBM Challenge, Day 2". Jeopardy!. February 16, 2011. Syndicated.
- ^ "World Community Grid to benefit from Jeopardy! competition". World Community Grid. February 4, 2011. Archived from the original on January 14, 2012. Retrieved February 19, 2011.
- ^ "Jeopardy! And IBM Announce Charities To Benefit From Watson Competition". IBM Corporation. January 13, 2011. Archived from the original on November 10, 2021. Retrieved February 19, 2011.
- ^ "IBM's Watson supercomputer crowned Jeopardy king". BBC News. February 17, 2011. Archived from the original on February 18, 2011. Retrieved February 17, 2011.
- ^ Markoff, John (February 16, 2011). "Computer Wins on 'Jeopardy!': Trivial, It's Not". The New York Times. Yorktown Heights, New York. Archived from the original on October 22, 2014. Retrieved February 17, 2011.
- ^ Searle, John (February 23, 2011). "Watson Doesn't Know It Won on 'Jeopardy!'". The Wall Street Journal. Archived from the original on November 10, 2021. Retrieved July 26, 2011.
- ^ Lohr, Steve (December 5, 2011). "Creating AI based on the real thing". The New York Times. Archived from the original on November 10, 2021. Retrieved February 26, 2017.
- ^ a b "NJ congressman tops 'Jeopardy' computer Watson". Associated Press. March 2, 2011. Archived from the original on March 7, 2011. Retrieved March 2, 2011.
- ^ a b c Schwerin, Mac (May 5, 2023). "America Forgot About IBM Watson. Is ChatGPT Next?". The Atlantic. Archived from the original on May 5, 2023.
- ^ Weber, Robert C. (February 14, 2011). "Why 'Watson' matters to lawyers". The National Law Review. Archived from the original on September 8, 2019.
- ^ a b c Lohr, Steve (2022-01-21). "IBM is selling off Watson Health to a private equity firm". The New York Times. ISSN 0362-4331. Archived from the original on 2022-01-21. Retrieved 2025-03-20.
- ^ a b c Lohr, Steve (2021-07-16). "What Ever Happened to IBM's Watson?". The New York Times. ISSN 0362-4331. Archived from the original on 2021-07-16. Retrieved 2025-03-20.
- ^ Lohr, Steve (2016-10-17). "IBM Is Counting on Its Bet on Watson, and Paying Big Money for It". The New York Times. ISSN 0362-4331. Retrieved 2025-03-20.
- ^ Ungerleider, Neal (April 23, 2014). "The North Face Testing Watson-Powered Virtual Personal Shoppers". Fast Company. Archived from the original on January 17, 2019.
- ^ Hesseldahl, Arik (February 12, 2014). "First Investment by IBM's Watson Fund Is for Welltok". Vox. Archived from the original on September 28, 2021.
- ^ Yang, Ina (July 13, 2015). "Caviar + Mango: Chef Watson Wants You to Cook Outside the Comfort Zone". NPR. Archived from the original on July 14, 2015.
- ^ O'Brien, Terrence (2015-06-20). "Watson's South American spin on a Canadian classic". Engadget. Archived from the original on 2020-05-11. Retrieved 2025-03-20.
- ^ Johnson, Scott K. (July 6, 2016). "IBM's Watson Fed Images to Estimate Water Use Efficiency in California". Ars Technica. Archived from the original on July 6, 2016.
- ^ a b Hardawar, Devindra (May 5, 2015). "IBM's big bet on Watson is paying off with more apps and DNA analysis". Engadget. Archived from the original on September 29, 2020.
- ^ Liu, Bin; Liao, Yuanyuan (2025-01-31). "Integrating IBM Watson BEAT generative AI software into flute music learning: the impact of advanced AI tools on students' learning strategies". Education and Information Technologies. doi:10.1007/s10639-025-13394-y. ISSN 1573-7608.
- ^ Jancer, Matt (26 August 2016). "IBM's Watson Takes On Yet Another Job, as a Weather Forecaster". Smithsonian. Archived from the original on 1 September 2016. Retrieved 29 August 2016.
- ^ Booton, Jennifer (15 June 2016). "IBM finally reveals why it bought The Weather Company". Market Watch. Archived from the original on 22 August 2016. Retrieved 29 August 2016.
- ^ Plenty, Rebecca (October 25, 2016). "Pearson Taps IBM's Watson as a Virtual Tutor for College Students". Bloomberg. Archived from the original on September 27, 2017. Retrieved September 26, 2017.
- ^ Moscaritolo, Angela (2 February 2017). "H&R Block Enlists IBM Watson to Find Tax Deductions". PC Magazine. Archived from the original on 15 February 2017. Retrieved 14 February 2017.
- ^ Lohr, Steve (2021-07-16). "What Ever Happened to IBM's Watson?". The New York Times. ISSN 0362-4331. Archived from the original on 2021-07-16. Retrieved 2025-03-20.
- ^ Schwerin, Mac (May 5, 2023). "America Forgot About IBM Watson. Is ChatGPT Next?". The Atlantic. Archived from the original on May 5, 2023.
- ^ a b c d Strickland, Eliza (April 2, 2019). "How IBM Watson Overpromised and Underdelivered on AI Health Care". IEEE Spectrum. Archived from the original on July 30, 2021.
- ^ Yu, Jea (April 10, 2023). "Back from the Dead, IBM's Watson AI is Alive and Re-Emerging". MarketBeat / Nasdaq. Archived from the original on April 26, 2023.
- ^ "IBM Watson is AI for Business". IBM. 9 July 2024.
- ^ a b c "Putting Watson to Work: Watson in Healthcare". IBM. Archived from the original on November 11, 2013. Retrieved November 11, 2013.
- ^ "IBM Watson Helps Fight Cancer with Evidence-Based Diagnosis and Treatment Suggestions" (PDF). IBM. Archived from the original (PDF) on April 26, 2013. Retrieved November 12, 2013.
- ^ Saxena, Manoj (February 13, 2013). "IBM Watson Progress and 2013 Roadmap (Slide 7)". IBM. Archived from the original on November 13, 2013. Retrieved November 12, 2013.
- ^ "MD Anderson Taps IBM Watson to Power "Moon Shots" Mission Aimed at Ending Cancer, Starting with Leukemia" (Press release). IBM. Archived from the original on 2017-02-21. Retrieved 2017-02-20.
- ^ a b Miliard, Mike (October 30, 2012). "Watson Heads to Medical School: Cleveland Clinic, IBM Send Supercomputer to College". Healthcare IT News. Archived from the original on November 11, 2013. Retrieved November 11, 2013.
- ^ Mathews, Anna Wilde (September 12, 2011). "Wellpoint's New Hire: What is Watson?". The Wall Street Journal. Archived from the original on February 22, 2017. Retrieved March 12, 2017.
- ^ "IBM's Jeopardy! Stunt Computer Is Curing Cancer Now". New York Magazine. November 23, 2016.
- ^ "MD Anderson Benches IBM Watson In Setback For Artificial Intelligence In Medicine". Forbes. Archived from the original on 2017-10-02. Retrieved 2017-09-18.
- ^ ANI (2016-10-28). "Manipal Hospitals to adopt IBM's 'Watson for Oncology' supercomputer for cancer treatment". Business Standard India. Archived from the original on 2017-01-18. Retrieved 2017-01-17.
- ^ Goel, Vindu (2017-09-28). "IBM Now Has More Employees in India Than in the U.S." The New York Times. ISSN 0362-4331. Retrieved 2025-03-20.
- ^ a b c "IBM Watson Group Unveils Cloud-Delivered Watson Services to Transform Industrial R&D, Visualize Big Data Insights and Fuel Analytics Exploration" (Press release). IBM. January 9, 2014. Archived from the original on October 12, 2020. Retrieved February 14, 2020.
- ^ Ante, Spencer E. (January 9, 2014). "IBM Set to Expand Watson's Reach". The Wall Street Journal. Archived from the original on May 9, 2015. Retrieved January 9, 2014.
- ^ "Inside the Lab". September 2017. Archived from the original on October 23, 2020. Retrieved October 6, 2020.
- ^ "IBM CEO Rometty Proposes 'Watson's Law': AI In Everything" Archived 2021-04-16 at the Wayback Machine, Adrian Bridgewater, Forbes, March 20, 2018
Bibliography
- Baker, Stephen (2011). Final Jeopardy: Man vs. Machine and the Quest to Know Everything. Boston, New York: Houghton Mifflin Harcourt. ISBN 978-0-547-48316-0.
Further reading
- Baker, Stephen (2012). Final Jeopardy: The Story of Watson, the Computer That Will Transform Our World. Mariner Books.
- Jackson, Joab (January 9, 2014). "IBM bets big on Watson-branded cognitive computing". PCWorld.
- Greenemeier, Larry (November 13, 2013). "Will IBM's Watson Usher in a New Era of Cognitive Computing?". Scientific American.
- Kelly, J.E.; Hamm, S. (2013). Smart Machines: IBM's Watson and the Era of Cognitive Computing. Columbia Business School Publishing.
External links
- Watson homepage
- DeepQA homepage
- About Watson on Jeopardy.com
- Smartest Machine on Earth (PBS NOVA documentary about the making of Watson)
- Power Systems
- The Watson Trivia Challenge. The New York Times. June 16, 2010.
- This is Watson – IBM Journal of Research and Development (published by the IEEE)
J! Archive
- Jeopardy! Show #6086 – Game 1, Part 1
- Jeopardy! Show #6087 – Game 1, Part 2
- Jeopardy! Show #6088 – Game 2
Videos
- PBS NOVA documentary on the making of Watson
- Building Watson – A Brief Overview of the DeepQA Project on YouTube (21:42), IBMLabs
- How Watson Answers a Question on YouTube
- David Ferrucci, Dan Cerutti and Ken Jennings on IBM's Watson at Singularity Summit 2011 on YouTube
- A Computer Called Watson on YouTube – November 15, 2011, David Ferrucci at Computer History Museum, alternate
- IBM Watson and the Future of Healthcare on YouTube – 2012
- IBM Watson-Introduction and Future Applications on YouTube – IBM at EDGE 2012
- IBM Watson for Healthcare on YouTube – Martin Kohn, 2013
- IBM Watson playlist, IBMLabs Watson playlist