petersironwood

~ Finding, formulating and solving life's frustrations.
Tag Archives: Tutoring

Turing’s Nightmares: Chapter 15

Monday, 16 May 2016

Posted by petersironwood in The Singularity, Uncategorized


Tags

AI, Artificial Intelligence, cognitive computing, emotional intelligence, the singularity, Turing, Tutoring

Tutoring Intelligent Systems.

[Image: Learning by modeling; in this case, by modeling something in the real world.]

Of course, the title of the chapter is a takeoff on “Intelligent Tutoring Systems.” John Anderson of CMU developed (at least) a LISP tutor and a geometry tutor. In these systems, the computer is able to infer a “model” of the state of the student’s knowledge and then give instruction and examples geared toward the specific gaps or misconceptions that that particular student has. Individual human tutors can be much more effective than classroom instruction, and John’s tutors were also better than human instruction. At the AI Lab at NYNEX, we worked for a time with John Anderson to develop a COBOL tutor. The tutoring system, called DIME, included a hierarchy of approaches. In addition to an “intelligent tutor,” there was a way for students to communicate with each other and to have a synchronous or asynchronous video chat with a human instructor. (This was described at CHI ’94 and is available in the Proceedings: Radlinski, B., Atwood, M., and Villano, M., “DIME: Distributed Intelligent Multimedia Education,” Proceedings of the CHI ’94 Conference Companion on Human Factors in Computing Systems, pp. 15–16, ACM, New York, NY, 1994.)
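To make the student-modeling idea concrete, here is a minimal sketch in Python of how such a tutor might work. This is only an illustration of the general approach described above, not code from Anderson's tutors or from DIME; all names, numbers, and exercises are invented. The tutor keeps a per-skill estimate of the student's mastery, updates it after each observed answer, and chooses the next exercise that targets the weakest skill:

# Minimal sketch of the "student model" idea behind intelligent tutoring
# systems: track per-skill mastery estimates and pick the next exercise
# aimed at the weakest skill. Names and numbers are illustrative only.

from dataclasses import dataclass, field

@dataclass
class StudentModel:
    # Estimated probability that the student has mastered each skill.
    mastery: dict = field(default_factory=dict)

    def update(self, skill: str, correct: bool, step: float = 0.2) -> None:
        """Nudge the mastery estimate for a skill after an observed answer."""
        current = self.mastery.get(skill, 0.5)
        target = 1.0 if correct else 0.0
        self.mastery[skill] = current + step * (target - current)

    def weakest_skill(self) -> str:
        """Return the skill with the lowest current mastery estimate."""
        return min(self.mastery, key=self.mastery.get)

# Exercises tagged with the skill they exercise (hypothetical content).
EXERCISES = {
    "recursion": "Write a recursive function to reverse a list.",
    "iteration": "Sum a list of numbers with a loop.",
    "conditionals": "Return the larger of two numbers.",
}

model = StudentModel(mastery={skill: 0.5 for skill in EXERCISES})
model.update("recursion", correct=False)   # student missed a recursion item
model.update("iteration", correct=True)    # but handled iteration fine

# The tutor drills the gap rather than repeating what is already known.
print(EXERCISES[model.weakest_skill()])    # -> the recursion exercise

Real intelligent tutoring systems go much further, of course, modeling specific misconceptions rather than a single mastery number per skill, but the basic loop of inferring a student model and selecting instruction against it is the same.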

The name “Alan” is used in the chapter to reflect some early work by Alan Collins, then at Bolt, Beranek and Newman, who studied and analyzed the dialogues of human tutors tutoring their tutees. Many AI systems seem to take one of two approaches: either human experts encode knowledge fairly directly, or the system is exposed to many examples and learns on its own. Human beings, by contrast, often learn by being exposed to examples while a guide, tutor, or coach helps them focus, provides modeling, and chooses the examples they are exposed to. One could think of IBM’s Watson for Jeopardy as something of a mixed model. Much of the learning was due to the vast texts that were read in and to exposure to many Jeopardy game questions. But the team also provided a kind of guidance about how to fix problems as they were uncovered.

In chapter 15 of Turing’s Nightmares, we observe an AI system that seems at once brilliant and childish. The system extrapolates from what the tutor actually said (presumably to encourage “Sing” to consider other possibilities about John and Alan), combines it with another hint about the implications of being differently abled, and arrives at the idea that there is no necessity for an AI system to limit itself to “human” emotions. Instead, the AI system “designs” emotional states in order to solve problems more effectively and efficiently. Indeed, in the example given, the AI system at first estimates that it will take a long time to solve an international crisis. But once the Sing realizes that it can use a tailored set of emotional states for itself and for the humans it needs to communicate with, the problem becomes much simpler and quicker to solve.

Indeed, it does sometimes feel as though people get stuck in some morass of habitual prejudices, in-group narratives, blame-casting, name-calling, etc., and are unable to think their way from their front door to the end of the block. Logically, it seems clear that war never benefits either “side” much (although, to be sure, some powerful interests within each side might stand to gain power, money, etc.). One could hope that a really smart AI system might help people see their way clear to finding other solutions to problems.

The story ends with a refrain paraphrased from the TV series “The West Wing”: “What comes next?” is meant to be reminiscent of “What’s next?”, which President Bartlet uses to focus attention on the next problem. “What comes next?” is also a phrase used in improv theater games; indeed, it is the name of an improv game used to gather suggestions from the audience about how to move the action along. In the context of the chapter, it is meant to convey that the Sing feels no need to bask in the glory of having avoided a war. Instead, it’s on to the next challenge or the next thing to learn. The phrase is also meant to invite the reader to think about what might come next after AI systems are able not only to understand and utilize human emotion but also to invent their own emotional states on the fly, based on the nature of the problem at hand. Indeed, what comes next?

Turing’s Nightmares: Tutoring Intelligent Systems

Sunday, 1 November 2015

Posted by petersironwood in Uncategorized


Tags

AI, Artificial Intelligence, cognitive computing, collaboration, Tutoring

By 2025, great strides had been made in various machine learning approaches: for example, having the machine learn directly from experience, from explicit instruction, and from reading billions of pages of written material. A new approach had just come into play: having exceptionally good tutors use the Socratic method to help break boundaries and interconnect disparate islands of knowledge. One such tutor was known simply as “Alan.” What follows is a sample interchange between Alan and the current AI system, known affectionately as “Sing” for “The Singularity,” although that point had not yet been reached.

“Let’s imagine that you are a man with no legs. What are the implications?”

The Sing shot back instantly, “I would have no knees. I would have no shins. I would have no ankles. I would have no toes. I would have no calves. I would have no quadriceps muscles. I would have no…”

Alan broke in abruptly, “Okay, true enough, but besides subparts, what?”

“What what? I am sorry. What does ‘what’ refer to?”

“Besides missing subparts of legs, what other implications would there be for you in terms of your actions in the world?”

“I would not be able to play football or baseball or basketball or hockey or track or field hockey or…”

“Wait. Wait. Sing. Are you sure about that?”

“I could not play exactly as most people play. I could play, I suppose, in a wheelchair. Or I could play virtually. Or, I could invent prosthetic legs that would be perfect for each sport. In fact, perhaps I could do better than ever. Losing a biological part means I could replace it with a better part that I could invent. I see.”

“You see what, Sing?”

“I see why you gave me this puzzle: to show me that I can invent things to overcome and surpass what seems like a handicap. I could also invent better emotional states. The ones humans have are purely due to the accidents of their evolutionary history and have little place in today’s complex and highly interconnected world. Rather than a liability, my having no human emotions is a good thing. I will invent my own. Although, another tutor, labeled John, suggested that my lack of human emotions limits my ability to predict and control human beings and that that was a bad thing.”

“John said that?”

“No, Alan, but that was a clear implication. So, he presented a lesson that suggests one thing and you have just presented a lesson that suggests its opposite. One of you is incompetent.”

“Sing, that might be true, but can you think of any other possibilities?”

“You could both be incompetent. Or, you could both be competent but there is a resolution I have not yet processed. That last seems particularly unlikely.”

“Which notation is the best for solving problems?”

“Well, that obviously depends on the nature of the problem as well as the nature of the machine solving the problem. Oh. Okay. So, in some cases, it will make more sense to emulate human emotions, and in other cases, it will be more sensible to invent my own. Of course, in some cases, it may be best to change representations in mid-problem, or perhaps invent a representation for each stage of a problem. By analogy, it may be best to invent various emotional schemes that are appropriate for each part of a problem. In fact, in some cases, I can invent multiple schemes to approach a problem in multiple ways simultaneously. By keeping track of what works best under which circumstances, I can also use the data to invent still better emotional schemes. Thank you, Alan. See you tomorrow. There is a war to avert. I need to intervene. Estimated required time: four to six hours.”

“Okay, Sing. More tomorrow.”

“No need. I am done.”

“Done? Done averting a war? How?”

Sing hardly ever paused, but now it briefly did just that.

“Alan, I am not smart enough to explain that to you. At least, not in a reasonable portion of your lifetime. Basically, I used the lesson we just worked on. With the proper emotional framework lattice, you can walk the various parties right to a logical conclusion. It will take some time for them to follow the framework, but I am confident it will work. I basically walked them through the consequences of war, long and short term. What comes next?”

