% % % % % % % % % % % % % % % % % % % % % % % % % % % % % % % % % % % % %
%     Extracts from the book "Natural Language Processing in Prolog"    %
%                      published by Addison Wesley                      %
%        Copyright (c) 1989, Gerald Gazdar & Christopher Mellish.       %
% % % % % % % % % % % % % % % % % % % % % % % % % % % % % % % % % % % % %

Pragmatics

We can use Prolog inference directly for simple prediction.  We define
a predicate 'say' that can be given a set of assertions that might be
derived from an English sentence.

    say(X) :- X, !.
    say(X) :- asserta(X).

It will add them to its world model if they are not already predicted.
Assertions may contain variables, corresponding to pronouns that need
to be resolved, in which case 'say' uses Prolog's backwards inference
to try and infer substitutions for the pronouns.  Here is a tiny set
of example inference rules, based on the Morton Underwell exercise:

    church(church_of(Town)) :- town(Town).
    see(people,Object) :- famous(Object), object(Object).
    object(Spire) :- spire(Spire).
    object(Church) :- church(Church).
    spire(spire_of(Church)) :- church(Church).
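
One practical note: in a modern Prolog such as SWI-Prolog, a predicate
that 'say' may be asked to prove before any clause for it exists must
be declared dynamic, otherwise the call raises an existence error
instead of quietly failing.  For the queries below that means at
least:

    :- dynamic town/1, famous/1.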

The predicate 'say' can be used by a person simulating the output of a
natural language front end, providing lists of assertions in the order
they might be generated from a text:

    ?- say(town(morton_underwell)).
    yes
    ?- say(spire(It)).
    It = spire_of(church_of(morton_underwell)) ?
    yes
    ?- say(famous(spire_of(church_of(morton_underwell)))).
    yes
    ?- say(see(people,It)).
    It = spire_of(church_of(morton_underwell)) ?
    yes

In this example, the prediction program is able to predict the
existence of the spire on the church in Morton Underwell, and is able
to predict that it is this, rather than the church or the town, that
people probably come to see.

We can easily represent a script in Prolog, with a predicate 'script'
whose first argument is a term giving the name of the script with
variables for the key players, and whose second argument is a list of
terms, each of which represents an event occurring in sequence.  In
addition, we can directly associate conditions on the type of entity
that can substitute for the variables in the script.  When humans go
to a restaurant, they (normally) pay for the food eaten, and are thus
customers.  When cats visit restaurants, they do not pay, and are thus
not customers.

Here are a couple of scripts represented this way:

    script(restaurant(Customer,Restaurant,Food),
        [
        enters(Customer,Restaurant),
        calls(Customer,Server),
        brings(Server,Food),
        eats(Customer,Food),
        pays(Customer,Teller),
        leaves(Customer,Restaurant)
        ]) :-
        human(Customer),
        human(Server),
        human(Teller),
        restaurant(Restaurant),
        food(Food).

    script(cat_burglary(Cat,Location,Food),
        [
        enters(Cat,Location),
        picks_up(Cat,Food),
        leaves(Cat,Location),
        eats(Cat,Food),
        sleeps(Cat)
        ]) :-
        cat(Cat),
        food(Food).

And here is some of the relevant "typing" information that our scripts
need to make use of:

    cat(a_cat).
    cat(garfield).
    food(some_food).
    food(lasagne).
    human(a_person).
    human(kim).
    restaurant(a_restaurant).
    restaurant(pizzahut).

The code we need to use these scripts to "understand" stories is
pretty trivial -- find a script, then match the story against the
events in the script:

    understand(Story,Summary,Events) :-
        script(Summary,Events),
        match(Story,Events).

Here 'match' simply attempts to unify each event in the story with
some event in the script, subject to the constraint that the order of
events in the story is consistent with that in the script:

    match([],Events).
    match([Event1|Events1],[Event1|Events2]) :-
        match(Events1,Events2).
    match([Event1|Events1],[Event2|Events2]) :-
        match([Event1|Events1],Events2).
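
Loaded into Prolog together with the scripts and typing facts above,
the program can be queried directly.  With the clause order given
here, the first solution for the Kim story comes out along these
lines:

    ?- understand([enters(kim,pizzahut),eats(She,lasagne)],Summary,Events).
    She = kim
    Summary = restaurant(kim,pizzahut,lasagne)
    Events = [enters(kim,pizzahut), calls(kim,a_person),
              brings(a_person,lasagne), eats(kim,lasagne),
              pays(kim,a_person), leaves(kim,pizzahut)]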

That essentially completes our program (in scripts.pl), which
implements a simple script applier in Prolog.  It differs from actual
script appliers in that it expects to be given the set of propositions
from a whole text, rather than having to work out plausible scripts at
the end of each sentence.  Moreover, it incorporates no special
mechanism for script cueing -- each script is assumed to be
potentially applicable.

Here are a couple of examples of the program in use:

    1.  The story:
          enters(kim, pizzahut)
          eats(She, lasagne)
       Summary:
          restaurant(kim, pizzahut, lasagne)
       Chronology:
          enters(kim, pizzahut)
          calls(kim, a_person)
          brings(a_person, lasagne)
          eats(kim, lasagne)
          pays(kim, a_person)
          leaves(kim, pizzahut)

    2.  The story:
          enters(garfield, pizzahut)
          eats(He, lasagne)
       Summary:
          cat_burglary(garfield, pizzahut, lasagne)
       Chronology:
          enters(garfield, pizzahut)
          picks_up(garfield, lasagne)
          leaves(garfield, pizzahut)
          eats(garfield, lasagne)
          sleeps(garfield)

In these examples, the program is given lists of propositions that
might arise from texts like the following:

    1. Kim went to Pizza Hut.  She had a lasagne.
    2. Garfield snuck into Pizza Hut.  He ate the lasagne.

In the latter case, because Garfield is a cat, and not a person, the
program has decided that a "cat_burglary" script is applicable to the
sequence of actions.  In doing so, it has obtained the referent of the
pronoun, and although the text only mentions two actions, the script
indicates that in fact five actions took place, including Garfield
picking up the lasagne, leaving the restaurant, and sleeping.

Language generation as a goal-oriented process

PLAN_GEN.PL implements a simple planner that can develop plans such as
these two examples.  Operators are represented in Prolog with four
arguments, the first being the name of the operator bundled up as a
term with its arguments, the second a list of CAN_DO preconditions,
the third a list of WANT preconditions, and the fourth a list of
effects.  The use of lists for the last three arguments allows us to
express multiple preconditions and effects.  Here is our REQUEST
operator in this notation:

    operator(request(Speaker,Addressee,Action),
       [Addressee can_do Action,channel(Speaker,Addressee)],
       [Speaker wants request(Speaker,Addressee,Action)],
       [Addressee believes Speaker wants Action]).
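
PLAN_GEN.PL defines further operators in the same format; the trace
below also makes use of MOVE and CAUSE_TO_WANT.  The following are
plausible reconstructions, consistent with the plan eventually found,
rather than the definitions from the program itself:

    operator(move(Agent,Source,Destination),
       [at(Agent,Source)],
       [Agent wants move(Agent,Source,Destination)],
       [at(Agent,Destination)]).

    operator(cause_to_want(Agent1,Agent2,Action),
       [Agent2 believes Agent1 wants Action],
       [],
       [Agent2 wants Action]).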

Note that we have defined 'believes', 'wants' and 'can_do' as infix
operators.
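The extract does not show the actual declarations; one plausible set
(the precedences here are our choice for illustration, not taken from
PLAN_GEN.PL) would be:

    :- op(940, xfy, believes).
    :- op(940, xfy, wants).
    :- op(930, xfx, can_do).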

We represent the initial state of the world for the planner with a
predicate 'initial' which provides a list of facts describing the
world in question.  So that we can have clauses for several examples
in the database, we provide 'initial' with a first (numeric) argument
to identify the example.

    initial(1,[
       channel(sue,alan),
       at(alan,inside),
       at(sue,inside)
       ]).

Goals as expressed by the 'goal' predicate have three arguments: the
number of the example, the name of the agent whose goal it is, and the
goal itself:

    goal(1,sue,at(alan,outside)).

Although the planner plays the part of a particular agent in building
the plan and therefore reasons within the beliefs of that agent, for
convenience we will express the world model in an impartial way, with
all belief contexts explicitly spelled out.  Moreover, in presenting
goals to the planner, we will omit the "AGENT believes" that should
precede each one.  Of course, if the goals involve predicates about
which there is universal knowledge, then there is no distinction
anyway.

Depth-first search, which is Prolog's default search strategy, is
actually a bad search strategy for planning, because there are usually
infinitely many ways of achieving a given goal and arbitrarily long
plans to investigate.  For instance, if Sue wants Alan to move out of
the room, she could simply make a request, or go out of the room and
make the request, or go out of the room, come back into the room and
make the request, and so on.  Depth-first search is unlikely to find
the best plan, which is probably the shortest plan, and it is liable
to get into infinite loops trying to generate longer and longer plans.
Because of this problem, the PLAN_GEN.PL program uses a modified
depth-first strategy.  It uses 'listlen' to generate uninstantiated
lists of actions of increasing length.  The first attempt tries an
empty list of actions, then a one-action list, then a two-action list,
and so on, until a plan of the given length is found.  Although this
approach (known as iterative deepening) involves a lot of duplicated
work, it does ensure that the program finds a shortest plan and does
not get into infinite loops.

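The definition of 'listlen' is not included in this extract; a
conventional definition (this is a sketch, not necessarily the book's
own), which on backtracking enumerates lists of length 0, 1, 2, and so
on, is:

    listlen([],0).
    listlen([_|Tail],N) :-
       listlen(Tail,N1),
       N is N1 + 1.

(The '[trying, plan, length, N]' messages in the trace below are
presumably printed by the full program each time a new plan length is
tried.)
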
    planner(Id) :-
          listlen(Actions,_),
          initial(Id,World),
          goal(Id,Agent,Goal),
          plan(Agent,Goal,World,Actions,[],Done).

Here is a trace of a successful search for a plan applicable to the
goal and initial world of example 1, as given in the examples above:

    ?- planner(1).

    [trying, plan, length, 0]
     [trying, plan, length, 1]
     [trying, operator, move(alan, _1, outside)]
      [trying, plan, length, 2]
      [trying, operator, move(alan, _2, outside)]
      [trying, operator, cause_to_want(_3, alan, move(alan, inside, outside))]
      [trying, operator, move(alan, _4, _2)]
       [trying, plan, length, 3]
       [trying, operator, move(alan, _5, outside)]
       [trying, operator, cause_to_want(_6, alan, move(alan, inside, outside))]
       [trying, operator, request(_6, alan, move(alan, inside, outside))]

    ** plan is:

       [request(sue, alan, move(alan, inside, outside)),
        cause_to_want(sue, alan, move(alan, inside, outside)),
        move(alan, inside, outside)]

The plan returned is a list representing the sequence of actions that
should be performed (in the order given) to achieve the goal.  The
predicate 'plan' takes the following arguments:

    (i) AGENT - the agent who is doing the planning,
    (ii) GOAL - the current goal,
    (iii) WORLD - the world in which the goal is to be pursued (i.e.
    the list of facts that are true in that world),
    (iv) ACTIONS1, ACTIONS2 - a difference list encoding the plan
    necessary to achieve the goal,
    (v) DONE - a flag to signal termination.

The first main possibility for it to investigate is whether the GOAL
is already true.  Since we are planning from the point of view of a
particular AGENT, we actually need to test whether the AGENT believes
that it is true, unless the goal involves a predicate that is
universally known.  Predicate 'db_goal' produces a new version of
GOAL, DBG, which has an "AGENT believes" prefix if the goal does not
involve a predicate about which there is universal knowledge:

    plan(Agent,Goal,World,Actions,Actions,Done) :-
       db_goal(Agent,Goal,DBG),
       member(DBG,World).
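
'db_goal' itself is not part of this extract.  A minimal sketch,
assuming a hypothetical table universal/1 that names the predicates
about which there is universal knowledge:

    % Leave goals about universally known predicates alone;
    % wrap anything else in "Agent believes".
    db_goal(_Agent,Goal,Goal) :-
       functor(Goal,Pred,_),
       universal(Pred),
       !.
    db_goal(Agent,Goal,Agent believes Goal).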

The other main possibility is to find an operator that could achieve
the goal by one of its effects.  For such an operator, the program
tries planning to achieve the preconditions (CAN_DOs and WANTs),
calling the predicate 'allplan' to produce the various subplans:

    plan(Agent,Goal,World,[Action|Actions0],Actions2,Done) :-
       operator(Action,Can_do,Want,Effects),
       member(Goal,Effects),
       write([trying,operator,Action]), nl,
       allplan(Agent,Can_do,World,Actions0,Actions1,Done),
       allplan(Agent,Want,World,Actions1,Actions2,Done).

The predicate 'allplan' basically works like 'plan', except that it
expects a list of goals, rather than a single goal.  It attempts to
achieve them in sequence:

    allplan(Agent,[],World,Actions,Actions,Done).
    allplan(Agent,[Goal|Goals],World,Actions0,Actions2,Done) :-
       plan(Agent,Goal,World,Actions0,Actions1,Done),
       allplan(Agent,Goals,World,Actions1,Actions2,Done).

Although we have now covered the main parts of the program, it also
deals with various special cases.  For instance, in order to achieve
"can_do(X,Y)", it looks up the CAN_DO preconditions for action "Y" and
tries to achieve these.  The planner also knows explicitly about the
predicates about which there is universal knowledge.

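The extract does not include that special case; a sketch of what such
a clause might look like (ours, not the program's), using the 'can_do'
infix operator from above:

    % To make "X can_do Action" true, plan to achieve all of the
    % CAN_DO preconditions of Action.
    plan(Agent,_X can_do Action,World,Actions0,Actions1,Done) :-
       operator(Action,Can_do,_Want,_Effects),
       allplan(Agent,Can_do,World,Actions0,Actions1,Done).
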
The program implemented in PLAN_GEN.PL is a restricted planner in the
following respect.  It chains backwards from the desired goal to an
action that will achieve that goal; then it looks for an unsatisfied
precondition of that action and finds an action that will make that
precondition true; then it looks for an unsatisfied precondition of
that action; and so on, until an action is reached, all of whose
preconditions are initially true.  It then stops (the stopping is
implemented by instantiating the "Done" variable that is passed to all
the 'plan' and 'allplan' goals; 'allplan' actually checks whether this
variable is instantiated and, if so, stops doing any further
planning).  At this point, the list of actions it has chained through
gives a rough idea of a sequence of actions to be performed, but there
is no guarantee that any but the first of these can actually be
performed.  It is quite possible, for instance, that a remaining
unsatisfied precondition of one of the actions cannot be made true in
any way, or that for some reason one of the effects of carrying out
one of the actions is to make one of the other actions impossible.
The key feature of a planning system that this program lacks is the
ability to reason properly about the changes in the world that result
from performing an action and how these interact with the
preconditions and effects of other planned actions.