:-module(ordered, [background_knowledge/2
                  ,metarules/2
                  ,positive_example/2
                  ,negative_example/2
                  ]).

:-use_module(configuration).
/** <module> A longer example for Veedrac on HN.

Initiate a multi-predicate learning session to learn all the targets
jointly. Use dynamic learning to learn each target incrementally, by
reusing each clause learned in an earlier iteration.

ordered/3 is true when len(n1) < len(n2) < len(n3):

==
?- time(learn_dynamic([llength/2,shorter/2,ordered/3])).
llength([],0).
llength(A,B):-tail(A,C),p(B,D),llength(C,D).
shorter(A,B):-llength(A,C),llength(B,D),s(C,D).
ordered(A,B,C):-shorter(A,B),shorter(B,C).
% 19,682 inferences, 0.016 CPU in 0.009 seconds (172% CPU, 1259648 Lips)
true.
==

ordered_leq/3 is true when len(n1) =< len(n2) =< len(n3):

==
?- time(learn_dynamic([llength/2,shorter/2,ordered_leq/3])).
llength([],0).
llength(A,B):-tail(A,C),p(B,D),llength(C,D).
shorter(A,B):-llength(A,C),llength(B,D),leq(C,D).
ordered_leq(A,B,C):-shorter(A,B),shorter(B,C).
% 41,861 inferences, 0.016 CPU in 0.011 seconds (142% CPU, 2679104 Lips)
true.
==

If something fails, check that the following configuration options are
set:

==
?- list_config.
example_clauses(call)
experiment_file(data/drafts/functions/ordered.pl,ordered)
learner(louise)
max_invented(0)
minimal_program_size(2,inf)
recursion_depth_limit(dynamic_learning,50000)
recursive_reduction(false)
reduction(plotkins)
resolutions(5000)
symbol_range(predicate,[P,Q,R,S,T])
symbol_range(variable,[X,Y,Z,U,V,W])
theorem_prover(resolution)
unfold_invented(false)
true.
==

*/
% ================================================================================
% Metarule constraints. The name is actually a misnomer: these are
% _clause_ constraints, used to eliminate unwanted clauses. We use them
% here mainly for aesthetic reasons.
% ================================================================================
% This constraint cuts left-recursive shorter/2 clauses that make it
% fiddly to run the learned hypothesis in Prolog.
configuration:metarule_constraints(M,fail):-
    M =.. [m,Id,P|Ps]
    ,Id \= projection
    ,left_recursive(P,Ps).
% True when the symbol T of a clause's head is also the symbol of its
% first or second body literal. Note: the second clause originally
% matched [T,T|_Ps], which is subsumed by the first clause and so
% unreachable; [_,T|_Ps] catches recursion via the second body literal.
left_recursive(T,[T|_Ps]):-
    !.
left_recursive(T,[_,T|_Ps]):-
    !.
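% For instance (an illustrative metasubstitution, not one copied from an
% actual learning run), the constraint above would reject the
% metasubstitution atom m(list_comp,shorter,shorter,llength,s), whose
% corresponding clause under the list_comp metarule is left-recursive:
%
% shorter(A,B):-shorter(A,C),llength(B,D),s(C,D).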
% Cuts clauses re-defining llength/2 by shorter/2:
% llength(A,B):-shorter(A,C),s(B,D),llength(C,D).
% This would be fine in an exploratory session, say.
configuration:metarule_constraints(M,fail):-
    M =.. [m,_Id,llength,shorter|_].

% Avoids definitions of shorter/2 by tail/2:
% shorter(A,B):-tail(A,C),tail(B,D),shorter(C,D).
% This compares the tails of two lists, which is the same as comparing
% the full lists, so it's correct, but again it's more useful in an
% exploratory session.
configuration:metarule_constraints(M,fail):-
    M =.. [m,_Id,shorter,tail|_].
% ================================================================================
% Various configs
% ================================================================================

% Sets the maximum number of predicates allowed to be invented.
% It's set to 0 because we don't need to invent any. We still use
% dynamic learning for the ability to learn incrementally by reusing
% predicates learned (or invented) in earlier learning steps.
:- auxiliaries:set_configuration_option(max_invented, [0]).
% ================================================================================
% Declarations of Meta-Interpretive Learning problem elements
% Elements of a MIL problem are: metarules, background knowledge (BK)
% and examples.
% ================================================================================

% Metarules for the experiment are derived from Prolog program
% skeletons. They are essentially "program recipes" that can be reused
% to construct different programs.
configuration:list_rec_func metarule 'P(x,y):- Q(x,z),R(y,u),P(z,u)'.
configuration:triadic_chain metarule 'P(x,z,y):- Q(x,z), R(z,y)'.
configuration:list_comp metarule 'P(x,y):- Q(x,z), R(y,u), S(z,u)'.
% Metarule declarations - though specific metarules are declared for
% each learning target, multi-predicate learning allows Louise to use
% any of the declared metarules for any target.
metarules(llength/2,[abduce,list_rec_func]).
metarules(shorter/2,[list_comp]).
metarules(ordered/3,[triadic_chain]).
metarules(ordered_leq/3,[triadic_chain]).
% Background knowledge (BK) declarations - again, multi-predicate
% learning means any BK predicate can be used for any target. For
% example, we could leave the two latter lists empty and add s/2 to the
% first.
background_knowledge(llength/2, [tail/2,p/2]).
background_knowledge(shorter/2, [s/2]).
% ground_peano/1 is added here so it's reported by list_mil_problem/1
background_knowledge(ordered/3, [ground_peano/1]).
background_knowledge(ordered_leq/3, [leq/2,ground_peano/1]).
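% With the metarules, BK and examples declared, the complete MIL problem
% for a target can be inspected at the top level with list_mil_problem/1
% (the same predicate mentioned in the comment above):
% ==
% ?- list_mil_problem(ordered/3).
% ==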
% Positive examples are declared separately for each target. They are
% added to the BK during learning, so any learning target can be defined
% in terms of these partial, extensional definitions of other targets.
positive_example(llength/2,E):-
    member(E, [llength([],0)
              ,llength([a],s(0))
              ]
          ).
positive_example(shorter/2,shorter(Xs,Ys)):-
    member(Xs-Ys,[[a]-[b,c]
                 ,[1,2]-[3,4,5]
                 ]
          ).
positive_example(ordered/3,ordered(S1,S2,S3)):-
    member([S1,S2,S3],[[[a],[b,c],[d,e,f]]
                      ]
          ).
positive_example(ordered_leq/3,ordered_leq(S1,S2,S3)):-
    member([S1,S2,S3],[[[a],[b],[d]]
                      ,[[a],[b,c],[d,e,f]]
                      ,[[a],[c],[e,f,g,h,i]]
                      ]
          ).
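% The example generators above can be checked at the Prolog top level by
% enumerating positive_example/2 directly (assuming this experiment file
% is loaded):
% ==
% ?- positive_example(shorter/2, E).
% E = shorter([a], [b, c]) ;
% E = shorter([1, 2], [3, 4, 5]).
% ==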
% Negative examples
negative_example(shorter/2,_):-
    fail.
negative_example(llength/2,_):-
    fail.
% The negative examples below are not strictly needed - but I add them
% to address Veedrac's concern that Louise is not learning a sound
% program for ordered/3.
negative_example(ordered/3,ordered(S1,S2,S3)):-
    member([S1,S2,S3],[[[a],[b],[c]]
                      ,[[a,b,c],[a,b],[c]]
                      ]
          ).
% Likewise for ordered_leq/3. Note that the example atom must have the
% target's own symbol, ordered_leq, not ordered.
negative_example(ordered_leq/3,ordered_leq(S1,S2,S3)):-
    member([S1,S2,S3],[[[a,b,c],[a,b],[c]]
                      ]
          ).
% ================================================================================
% Background knowledge predicate definitions.
%
% BK in Louise consists of arbitrary Prolog programs. Strictly speaking,
% they must have definite datalog heads, but this is not enforced in
% practice. I like to see BK as a kind of programming library,
% containing sub-routines and sub-programs from which the target
% programs can be composed.
%
% Metarules, on the other hand, must be definite datalog and Louise
% _does_ enforce this restriction. The predicates below are used to
% allow Louise to process non-datalog examples, i.e. examples with
% function symbols in their head or body literals. In a sense, the
% function symbols are "hidden away" in the BK. The technique is similar
% to "flattening": Rouveirol, C, Machine Learning, 1994:
%
% https://paperity.org/p/7456046/flattening-and-saturation-two-representation-changes-for-generalization
%
% The predicates below have universal applicability for problems
% involving lists and numbers. head/2 and tail/2 split a list; s/2 and
% p/2 are used to "dereference" Peano number functions.
%
% Note that head/2 is not actually used above.
% ================================================================================
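% As a quick sanity check of this flattened representation, the learned
% llength/2 clauses from the session in the module docstring can be run
% directly against the BK defined below: the query llength([a,b],N)
% first binds N to the Peano number s(s(0)), i.e. length 2, with the
% list deconstructed by tail/2 and the number by p/2.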
%! head(?List,?Head) is semidet.
%
% True when Head is the head of list List.
%
head([H|_T],H).

%! tail(?List,?Tail) is semidet.
%
% True when Tail is the tail of list List.
%
tail([_H|T],T).
%! leq(?N,?M) is nondet.
%
% True when N =< M, where N, M are Peano integers.
%
leq(X,X):-
    ground_peano(X).
leq(X,Y):-
    s(X,Y).

%! p(?Number, ?Predecessor) is nondet.
%
% A Peano Number and its Predecessor.
%
p(s(N),0):-
    ground_peano(N).
p(s(N),N).
p(s(N),s(M)):-
    ground_peano(N)
    ,p(N,M).

%! s(?Number, ?Successor) is nondet.
%
% A Peano Number and its Successor.
%
s(0,s(N)):-
    ground_peano(N).
s(N,s(N)).
s(s(M),s(N)):-
    ground_peano(N)
    ,s(M,N).
%! ground_peano(?N) is semidet.
%
% True when N is a ground Peano number.
%
ground_peano(X):-
    ground(X)
    ,\+ is_list(X).
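% A couple of hand-verified example queries for the Peano predicates
% above (only first solutions shown):
% ==
% ?- s(0, S).
% S = s(0).
%
% ?- leq(s(0), s(s(0))).
% true .
% ==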