# Smart Learning [SmartLearning] (c) (r) (tm)
# Copyright Dean Van Greunen 2018 to infinity, All Rights Reserved.

import json       # used to serialize/deserialize neuron memory
import threading  # used to run the Input/Output stream loops

# Smart Learning is an Artificial Intelligence system designed with inspiration from:
#   - The human brain
#   - Deep learning systems
#   - The MATLAB programming language
#   - The JavaScript programming language, with the server-side execution design of Node.js
#   - The Go programming language
# Author: Dean Van Greunen
# 027 885 5371 (for South African local calls only)
# +27 72 885 5371 (for international calls)

class SmartLearning:
    #################
    # NN class code #   (this is a SmartLearning NN)
    #################
    # Neural network: simple, elegant, and unconventional,
    # yet it is:
    #  - organized
    #  - well formatted
    #  - easy to (while running):
    #       - import
    #       - export
    #       - modify
    #  - supports:
    #       - unbounded distance between neurons
    #       - unbounded hooks (neurons talking to other neurons; more hooks just
    #         mean more processing). The only allowed math ops are + - / *; if a
    #         more complicated function is needed, it is expressed as a chain:
    #           [
    #               {
    #                   "a": {
    #                       "b": "a+1",
    #                       "c": "b*0.90"
    #                   },
    #                   "d": {
    #                       "e": "c+1",
    #                       "f": "e-0.90"
    #                   },
    #               },
    #           ]
    #           This NN takes a single input at node "a" and produces the output at
    #           node "f". The input can be any value, and the chain evaluates to:
    #               ((((a+1)*0.90)+1)-0.90)
    #               let a = 2:
    #                   ((((2+1)*0.90)+1)-0.90) = (((3*0.90)+1)-0.90) = ((2.7+1)-0.90) = (3.7-0.90) = 2.8
    #           After some time, it can be optimized by collapsing the chain into:
    #           [
    #               {
    #                   "a": {
    #                       "f": "(((a+1)*0.90)+1)-0.90"
    #                   },
    #               },
    #           ]
    #           Access time then drops from 8 node accesses to 2, a 4x reduction.
    class NN:
        # Format:
        # NN = [{"x": {"y"}}] or [{"x": {"y": "y+1", "y+n": "y+n*2.90"}}]
        # NN = [
        #       {"x": {"y": "x*0.87"}},    # maps x to y, where y = (x * 0.87)
        #       {"x+1": {"y": "x*0.67"}}   # maps x+1 to y ("x+1" names the next x var, it is not math), where y = (x * 0.67) using the parent var
        #   ]
        # or
        #  [
        #       {"x": {"y": "y+1", "y+n": "y*0.90"}},
        #       {"x+1": {"y": "y+1", "y+n": "y*1.09"}}  # maps x+1 to y ("x+1" names the next x var, it is not math)
        #  ]
        working_neurons = []        # Working memory, always optimized for performance; holds pointers to memory neurons as mem_x or mem_y.
                                    # The names x and y are not significant and can change, as each entry is only a mathematical function.
        hard_memory_neurons = []    # Long-term memory, not optimized; when saving, the working neurons are un-optimized and added here.

        def reset(self):  # Good for wiping/clearing working memory (a.k.a. temp memory); resets to the state just before "waking up".
            self.working_neurons = []  # Clears ALL neurons in working memory, a.k.a. a "black out".

        def wipe(self):  # WARNING: HARD RESET. Produces a completely empty NN, like a brand-new one. Useful if the NN has gone rogue.
            self.hard_memory_neurons = []  # Clears ALL neurons in long-term memory.
            self.reset()

        def map(self, n_x, n_y, steps=1):  # Solves for Y by X using brute force in steps; if set to 1, it does an initial rough y = x from current memory.
            #                                Note: with this method, "learning" takes as long as it would for a human, except for the speed advantage
            #                                an electronic computing device has over a biological one (the brain, human or not).
            # TODO: Map properly, as [X->Y]. Must perform a search; each output is mapped to its unique input, and any math logic is re-used.
            pass  # This mapping function is not available in this version of the code. Please contact [email protected] for help, support, or donations for this project.

        def optimize(self):  # Optimizes long-term memory into working memory.
            pass  # This optimize function is not available in this version of the code. Please contact [email protected] for help, support, or donations for this project.

        def save(self, soft_save=True, hard_save=True, path="", filename="default.pyai-nn"):
            # Soft save: saves only working memory.
            if soft_save:
                self.soft_save(path, filename)
            # Hard save: saves long-term memory.
            if hard_save:
                self.hard_save(path, filename)

        def soft_save(self, path="", filename="default.pyai-nn"):
            # Saves optimized (working) data.
            f_mem_opt_data = open(path + filename + "-opt", "w+")
            f_mem_opt_data.write(json.dumps(self.working_neurons))
            f_mem_opt_data.close()

        def hard_save(self, path="", filename="default.pyai-nn"):
            # Saves complete un-optimized data.
            f_mem_data = open(path + filename + "-mem", "w+")
            f_mem_data.write(json.dumps(self.hard_memory_neurons))
            f_mem_data.close()

        def load(self, soft_load=True, hard_load=True, path="", filename="default.pyai-nn"):
            # Soft load: loads only working memory.
            if soft_load:
                self.soft_load(path, filename)
            # Hard load: loads long-term memory.
            if hard_load:
                self.hard_load(path, filename)

        def soft_load(self, path="", filename="default.pyai-nn"):
            # Loads optimized (working) data.
            f_mem_opt_data = open(path + filename + "-opt", "r")
            mem_opt_data = f_mem_opt_data.read()
            f_mem_opt_data.close()
            self.working_neurons = json.loads(mem_opt_data)

        def hard_load(self, path="", filename="default.pyai-nn"):
            # Loads complete un-optimized data.
            f_mem_data = open(path + filename + "-mem", "r")
            mem_data = f_mem_data.read()
            f_mem_data.close()
            self.hard_memory_neurons = json.loads(mem_data)

        def smartlearn(self):
            pass

    ####################
    # Event class code #
    ####################
    # Event: a clever way to exchange data and format it without actually changing it, only the way it is perceived.
    class Event:
        chains = None      # All events to chain with this event.
        func = None        # Function to call after formatting is complete.
        formatting = None  # Formatting function to call; it is passed the data and must return data.

        def init(self, func=None, formatting=None, chains=None):  # Events can be empty.
            # This creates a data-formatting matrix: it takes the func to be called and applies the
            # formatting as a matrix which converts the data. Events can be chained.
            if func:
                self.func = func
            else:
                self.func = self.ret
            if formatting:
                self.formatting = formatting
            else:
                self.formatting = self.ret
            self.chains = chains if chains else []  # Avoids a shared mutable default argument.

        def bindEvent(self, event):  # Binds an event to the chain (stack: first in, first called).
            self.chains.append(event)

        def call(self, data):
            if len(self.chains) >= 1:  # Check whether there are items in the chain.
                for ev in self.chains:  # Loop through the events in the chain.
                    if isinstance(ev, SmartLearning.Event):  # Check whether the item in the chain is an event, else skip it.
                        ev.call(data)  # If any subscribing events have chains of their own, they will all be called as well.
            self.func(self.formatting(data))  # The data is not destroyed.

        def ret(self, x):
            return x

    ####################
    # Input class code #
    ####################
    # Input, which has an NN, monitors the current environment.
    class Input:
        parent_events = None
        stream = None
        queue = []

        def init(self):
            # Created here rather than in the class body, where SmartLearning does not exist yet.
            self.parent_events = SmartLearning.Event()
            self.parent_events.init()
            threading.Thread(target=self.notifier, daemon=True).start()

        def setStream(self, stream):
            self.stream = stream

        def bindEvent(self, event):
            self.parent_events.bindEvent(event)

        def notifier(self):
            while self.stream:
                if self.stream.canFetch():
                    self.queue = self.stream.fetch()
                    for i, q in enumerate(self.queue):
                        self.stream.pop(q)
                        self.queue[i] = None

    #####################
    # Output class code #
    #####################
    # Output, which has an NN, affects the current environment.
    class Output:
        parent_events = None
        stream = None
        queue = []

        def init(self):
            # Created here rather than in the class body, where SmartLearning does not exist yet.
            self.parent_events = SmartLearning.Event()
            self.parent_events.init()
            threading.Thread(target=self.sender, daemon=True).start()

        def setStream(self, stream):
            self.stream = stream

        def bindEvent(self, event):
            self.parent_events.bindEvent(event)

        def sender(self):
            while self.stream:
                if self.stream.canPush():
                    for i, q in enumerate(self.queue):
                        self.stream.push(q)
                        self.queue[i] = None

    ##########################
    # Environment class code #
    ##########################
    # Environment contains input and output, which both have NNs.
    class Environment:
        io = None
        parent_events = None
        ev_notify = None

        def init(self):
            self.parent_events = SmartLearning.Event()
            self.parent_events.init()
            self.ev_notify = SmartLearning.Event()
            self.ev_notify.init(func=self.notify)
            self.io = {
                "i": SmartLearning.Input(),
                "o": SmartLearning.Output()
            }
            self.io["i"].init()
            self.io["o"].init()
            self.io["i"].bindEvent(self.ev_notify)
            self.io["o"].bindEvent(self.ev_notify)

        def bindEvent(self, event):
            self.parent_events.bindEvent(event)

        def notify(self, data):
            self.parent_events.call(data)

    #########################
    # Processing class code #
    #########################
    # Processing contains the environment, which contains input and output, which both have NNs.
    class Processing:
        nn = None
        environment = None
        parent_events = None
        ev_notify = None

        def init(self):
            self.nn = SmartLearning.NN()
            self.environment = SmartLearning.Environment()
            self.parent_events = SmartLearning.Event()
            self.parent_events.init()
            self.ev_notify = SmartLearning.Event()
            self.ev_notify.init()
            self.environment.init()

        def bindIO(self, i, o):
            self.environment.io["i"].setStream(i)
            self.environment.io["o"].setStream(o)

        def bindEvent(self, event):
            self.parent_events.bindEvent(event)

        def notify(self, data):
            self.parent_events.call(data)

    ###################
    # Main class code #
    ###################
    processor = None
    parent_events = None
    ev_notify = None

    def init(self):
        self.processor = SmartLearning.Processing()
        self.parent_events = SmartLearning.Event()
        self.parent_events.init()
        self.ev_notify = SmartLearning.Event()
        self.ev_notify.init()
        self.processor.init()

    def bindEvent(self, event):
        self.parent_events.bindEvent(event)

    def notify(self, data):
        self.parent_events.call(data)

    def loadStreams(self, i_stream, o_stream):
        self.processor.bindIO(i_stream, o_stream)

    def boot(self):  # Loads hard (long-term) memory only. Usually done a few times during AI uptime, or after hitting the kill switch.
        self.processor.nn.hard_memory_neurons = []
        self.processor.nn.load(soft_load=False, hard_load=True)
        self.wake()

    def wake(self):  # Loads working memory only.
        self.processor.nn.working_neurons = []
        self.processor.nn.load(soft_load=True, hard_load=False)

    def sleep(self):  # Slowly pauses tasks one by one from least to most important,
        #              then de-optimizes working memory, stores it into hard memory,
        #              saves hard memory, and clears optimized (working) memory.
        self.processor.nn.optimize()
        self.processor.nn.save(soft_save=True, hard_save=False)
        self.processor.nn.working_neurons = []

    def shutdown(self):  # Runs sleep, then clears both hard and working memory after saving, then shuts the AI down completely.
        #                  Useful for debugging and going through the "AI's brain" data files.
        self.sleep()
        self.processor.nn.save(soft_save=False, hard_save=True)
        self.processor.nn.hard_memory_neurons = []
        quit()

    def kill(self):  # Crashes the AI; a good kill switch for a rogue AI.
        quit()

#############################################
#               USAGE DEMO                  #
#############################################
# Note: run the following in a different thread, while keeping access to sl_ai.

# Create the SmartLearning AI system.
sl_ai = SmartLearning()
# Initialize the subsystems.
sl_ai.init()

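# A hypothetical minimal stream sketch (not part of the original code): the Input
# notifier loop assumes canFetch()/fetch()/pop(), and the Output sender loop
# assumes canPush()/push(). Any object implementing that interface could be
# passed to loadStreams() in place of the StreamObject placeholder.
class ExampleStream:
    def __init__(self):
        self.items = []           # pending data items

    def canFetch(self):
        return len(self.items) > 0

    def fetch(self):
        return list(self.items)   # snapshot of the pending items

    def pop(self, item):
        self.items.remove(item)   # consume one fetched item

    def canPush(self):
        return True

    def push(self, item):
        self.items.append(item)
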
# Load the world-information stream (o_stream) and the
# AI actor/agent control stream (i_stream).
sl_ai.loadStreams(i_stream=StreamObject, o_stream=StreamObject)

# Start up the AI for the first time (booting).
sl_ai.boot()

# Every 3 to 9 hours do a sleep/wake cycle:
# sl_ai.sleep()
# sl_ai.wake()

# The AI should be functional now. It will take roughly 9 to 48 human months to learn how to speak and walk.
# After roughly 20 years it will be able to program a version of itself.
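
# A minimal sketch (not part of the original code) of evaluating the chained hook
# format documented in the NN class: each mapping {"b": "a+1", "c": "b*0.90", ...}
# is walked in order, binding each output name to the value of its expression,
# until the requested output node has been produced. The function name and its
# signature are assumptions for illustration only.
def evaluate_hooks(hook_map, input_name, input_value, output_name):
    values = {input_name: input_value}  # known node values, seeded with the input
    for group in hook_map:              # e.g. {"a": {...}, "d": {...}}
        for _, mappings in group.items():
            for out_name, expr in mappings.items():
                # Only + - * / over known node names, per the documented restriction.
                values[out_name] = eval(expr, {"__builtins__": {}}, dict(values))
    return values[output_name]

# The chain from the NN comments: ((((a+1)*0.90)+1)-0.90) with a = 2 gives 2.8.
_chain = [{"a": {"b": "a+1", "c": "b*0.90"}, "d": {"e": "c+1", "f": "e-0.90"}}]
assert abs(evaluate_hooks(_chain, "a", 2, "f") - 2.8) < 1e-9

# The optimized single-hook form evaluates to the same value with fewer node accesses.
_optimized = [{"a": {"f": "(((a+1)*0.90)+1)-0.90"}}]
assert abs(evaluate_hooks(_optimized, "a", 2, "f") - 2.8) < 1e-9
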