# Smart Learning [SmartLearning] (c) (r) (tm)
# Copyright Dean Van Greunen 2018 to infinity, All rights Reserved.


# Smart Learning is an Artificial Intelligence System which was designed and inspired by:
# - The Human Brain
# - The Deep Learning System
# - The MatLab Programming Language
# - The JavaScript Programming Language, with the server-side code execution design of Node.js
# - The Go Programming Language
# Author: Dean Van Greunen
# Email: deanvg9000@gmail.com
# 027 885 5371 (for South Africa local calls only)
# +27 72 885 5371 (for international calls)

import json       # used by the save/load helpers below to (de)serialize neuron data
import threading  # used to start the Input/Output monitoring threads


class SmartLearning:
    #################
    # NN class Code # (this is a SmartLearning NN)
    #################
    # Neural Network: simple and elegant, unconventional,
    # however it is:
    # - organized
    # - well formatted
    # - easy to (while running):
    #   - import
    #   - export
    #   - modify
    # - supports:
    #   - unbounded distance between neurons
    #   - unbounded hooks (neurons talking to other neurons; more just means more processing)
    #     [the only allowed math ops are: + - / *]; if more complicated functions are needed,
    #     they are expressed as chains like this:
    # [
    #     {
    #         "a": {
    #             "b": "a+1",
    #             "c": "b*0.90"
    #         },
    #         "d": {
    #             "e": "c+1",
    #             "f": "e-0.90"
    #         },
    #     },
    # ]
    # This NN takes a single input for "a" and produces the output for the required "f".
    # In this example the input value can be anything, and the chain evaluates to:
    # ((((a+1)*0.90)+1)-0.90)
    # let a = 2:
    # ((((2+1)*0.90)+1)-0.90) = ((((3)*0.90)+1)-0.90) = ((2.7+1)-0.90) = (3.7 - 0.90) = 2.8
    # After some time, it could be optimized by collapsing the chain into:
    # [
    #     {
    #         "a": {
    #             "f": "(((a+1)*0.90)+1)-0.90"
    #         },
    #     },
    # ]
    # Access time then drops from 8 node accesses to 2, which is a significant saving.
    # (An illustrative evaluation sketch follows this comment block.)
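    # A minimal illustrative sketch (not part of the original design): evaluating the
    # chained-expression example above with Python's eval(), restricted to plain arithmetic.
    # The method name and the "exprs" list below are hypothetical.
    @staticmethod
    def _example_evaluate_chain(a):
        # chain taken from the comment block above: a -> b -> c -> e -> f
        exprs = [("b", "a+1"), ("c", "b*0.90"), ("e", "c+1"), ("f", "e-0.90")]
        scope = {"a": a}
        for name, expr in exprs:
            # evaluate each hook with access only to already-computed values
            scope[name] = eval(expr, {"__builtins__": {}}, scope)
        return scope["f"]  # e.g. a = 2 gives ~2.8, matching the worked example above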
    class NN:
        # format of
        # NN = [{"x": {"y"}}] or [{"x": {"y": "y+1", "y+n": "y+n*2.90"}}]
        # NN = [
        #     {"x": {"y": "x*0.87"}},   # maps x to y, which is computed as (x * 0.87)
        #     {"x+1": {"y": "x*0.67"}}  # maps x+1 to y (not math, but the Xn var...), computed as (x * 0.67) using the parent var
        # ]
        # or
        # [
        #     {"x": {"y": "y+1", "y+n": "y*0.90"}},
        #     {"x+1": {"y": "y+1", "y+n": "y*1.09"}}  # maps x+1 to y (not math, but the Xn var...)
        # ]
        working_neurons = []      # Working memory, always optimized for performance; has pointers to memory_neurons as mem_x or mem_y.
                                  # The names x and y are not relevant and can change, as it is only a mathematical function.
        hard_memory_neurons = []  # Memory, not optimized; when saving, the working neurons are un-optimized and then added to memory.

        def reset(self):  # good for wiping/clearing working memory, a.k.a. temp memory; it resets to what it was just before "waking up"
            self.working_neurons = []  # Clears ALL neurons in working memory, a.k.a. "Black Out"

        def wipe(self):  # WARNING: HARD RESET, makes a completely empty NN, like a brand new NN; useful if the NN has gone rogue.
            self.hard_memory_neurons = []  # Clears all neurons in old (hard) memory
            self.reset()

        def map(self, n_x, n_y, steps=1):  # Solves for y from x by brute force in steps; if set to 1, it does an initial rough pass (y = x) from current memory.
            # Note: with this method, "learning" takes as long as it would for a human, except for the speed advantage
            # an electronic computing device has over a biological computing device (the brain, human or not).
            # TODO: Map properly, as [X->Y]... Must perform a search, each output is mapped to its unique input, etc., and any math logic is re-used.
            pass  # This mapping function is not available in this version of the code; please contact deanvg9000@gmail.com for help, support or donations for this project

        def optimize(self):  # this optimizes memory into working memory.
            pass  # This optimizing function is not available in this version of the code; please contact deanvg9000@gmail.com for help, support or donations for this project

        def save(self, soft_save=True, hard_save=True, path="", filename="default.pyai-nn"):
            # Soft save: saves only working memory.
            if soft_save:
                self.soft_save(path, filename)
            # Hard save: saves hard (long-term) memory.
            if hard_save:
                self.hard_save(path, filename)

        def soft_save(self, path="", filename="default.pyai-nn"):
            # Saves optimized (working) data
            f_mem_opt_data = open(path + filename + "-opt", "w")
            f_mem_opt_data.write(json.dumps(self.working_neurons))
            f_mem_opt_data.close()

        def hard_save(self, path="", filename="default.pyai-nn"):
            # Saves complete un-optimized data
            f_mem_data = open(path + filename + "-mem", "w")
            f_mem_data.write(json.dumps(self.hard_memory_neurons))
            f_mem_data.close()

        def load(self, soft_load=True, hard_load=True, path="", filename="default.pyai-nn"):
            # Soft load: loads only working memory.
            if soft_load:
                self.soft_load(path, filename)
            # Hard load: loads hard (long-term) memory.
            if hard_load:
                self.hard_load(path, filename)

        def soft_load(self, path="", filename="default.pyai-nn"):
            # Loads optimized data
            f_mem_opt_data = open(path + filename + "-opt", "r")
            mem_opt_data = f_mem_opt_data.read()
            f_mem_opt_data.close()
            self.working_neurons = json.loads(mem_opt_data)

        def hard_load(self, path="", filename="default.pyai-nn"):
            # Loads complete un-optimized data
            f_mem_data = open(path + filename + "-mem", "r")
            mem_data = f_mem_data.read()
            f_mem_data.close()
            self.hard_memory_neurons = json.loads(mem_data)

        def smartlearn(self):
            pass

    ####################
    # Event class Code #
    ####################
    # Event: a clever way to exchange data and format it without actually changing it, only the way it is perceived.
    class Event:
        chains = []        # All events to chain with this event
        func = None        # function to call after formatting is complete
        formatting = None  # formatting function to call; it is passed the data and is expected to return data

        def init(self, func=None, formatting=None, chains=None):  # Events can be empty
            # This creates a data-formatting pipeline: the formatting function converts the data before func is
            # called, and events can be chained.
            if func:
                self.func = func
            else:
                self.func = self.ret
            if formatting:
                self.formatting = formatting
            else:
                self.formatting = self.ret
            self.chains = chains if chains else []  # avoid sharing a mutable default list between instances

        def bindEvent(self, event):  # appends an event to the chain (first in, first called)
            self.chains.append(event)

        def call(self, data):
            if len(self.chains) >= 1:  # check if there are items in the chain
                for ev in self.chains:  # loop through events in the chain
                    if isinstance(ev, SmartLearning.Event):  # check that the item in the chain is an Event, else skip it
                        ev.call(data)  # if any subscribing events have chains, they will all be called as well.
            self.func(self.formatting(data))  # data is not destroyed

        def ret(self, x):
            return x

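    # A minimal illustrative usage sketch (not part of the original code): chaining two Events so the
    # child fires before the parent's own func, with a formatting step applied to the data.
    # The method name and the lambdas below are hypothetical.
    @staticmethod
    def _example_event_chain():
        child = SmartLearning.Event()
        child.init(func=lambda data: print("child saw:", data))
        parent = SmartLearning.Event()
        parent.init(func=lambda data: print("parent saw:", data),
                    formatting=lambda data: data.upper())
        parent.bindEvent(child)  # chained events are called first, with the raw (unformatted) data
        parent.call("hello")     # prints "child saw: hello" then "parent saw: HELLO"
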
    ####################
    # Input class Code #
    ####################
    # Input, which has an NN, monitors the current environment
    class Input:
        parent_events = None
        stream = None
        queue = []

        def init(self):
            self.parent_events = SmartLearning.Event()
            self.parent_events.init()
            self.queue = []
            # threading.Thread replaces the undefined startThread() helper from the original sketch
            threading.Thread(target=self.notifier, daemon=True).start()

        def setStream(self, stream):
            self.stream = stream

        def bindEvent(self, event):
            self.parent_events.bindEvent(event)

        def notifier(self):
            # note: the stream must be set before init() starts this thread, otherwise the loop exits immediately
            while self.stream:
                if self.stream.canFetch():
                    self.queue = self.stream.fetch()
                    for i, q in enumerate(self.queue):
                        self.stream.pop(q)
                        self.queue[i] = None

    #####################
    # Output class Code #
    #####################
    # Output, which has an NN, affects the current environment
    class Output:
        parent_events = None
        stream = None
        queue = []

        def init(self):
            self.parent_events = SmartLearning.Event()
            self.parent_events.init()
            self.queue = []
            # threading.Thread replaces the undefined startThread() helper from the original sketch
            threading.Thread(target=self.sender, daemon=True).start()

        def setStream(self, stream):
            self.stream = stream

        def bindEvent(self, event):
            self.parent_events.bindEvent(event)

        def sender(self):
            # note: the stream must be set before init() starts this thread, otherwise the loop exits immediately
            while self.stream:
                if self.stream.canPush():
                    for i, q in enumerate(self.queue):
                        self.stream.push(q)
                        self.queue[i] = None

    ##########################
    # Environment class Code #
    ##########################
    # Environment contains Input and Output, which both have NNs
    class Environment:
        io = None
        parent_events = None
        ev_notify = None

        def init(self):
            self.parent_events = SmartLearning.Event()
            self.parent_events.init()
            self.ev_notify = SmartLearning.Event()
            self.ev_notify.init()
            self.io = {
                "i": SmartLearning.Input(),
                "o": SmartLearning.Output()
            }
            self.ev_notify.func = self.notify
            self.io["i"].init()
            self.io["o"].init()
            self.io["i"].bindEvent(self.ev_notify)
            self.io["o"].bindEvent(self.ev_notify)

        def bindEvent(self, event):
            self.parent_events.bindEvent(event)

        def notify(self, data):
            self.parent_events.call(data)

    #########################
    # Processing class Code #
    #########################
    # Processing contains the Environment, which contains Input and Output, which both have NNs
    class Processing:
        nn = None
        environment = None
        parent_events = None
        ev_notify = None

        def init(self):
            self.nn = SmartLearning.NN()
            self.environment = SmartLearning.Environment()
            self.parent_events = SmartLearning.Event()
            self.parent_events.init()
            self.ev_notify = SmartLearning.Event()
            self.ev_notify.init()
            self.environment.init()

        def bindIO(self, i, o):
            self.environment.io["i"].setStream(i)
            self.environment.io["o"].setStream(o)

        def bindEvent(self, event):
            self.parent_events.bindEvent(event)

        def notify(self, data):
            self.parent_events.call(data)

    ###################
    # Main class Code #
    ###################
    processor = None
    parent_events = None
    ev_notify = None

    def init(self):
        self.processor = SmartLearning.Processing()
        self.parent_events = SmartLearning.Event()
        self.parent_events.init()
        self.ev_notify = SmartLearning.Event()
        self.ev_notify.init()
        self.processor.init()

    def bindEvent(self, event):
        self.parent_events.bindEvent(event)

    def notify(self, data):
        self.parent_events.call(data)

    def loadStreams(self, i_stream, o_stream):
        self.processor.bindIO(i_stream, o_stream)

    def boot(self):  # loads hard memory only (usually done a few times during AI uptime or after hitting the kill switch)
        # memory and the save/load helpers live on the processor's NN
        self.processor.nn.hard_memory_neurons = []
        self.processor.nn.load(soft_load=False, hard_load=True)
        self.wake()

    def wake(self):  # loads working memory only.
        self.processor.nn.working_neurons = []
        self.processor.nn.load(soft_load=True, hard_load=False)

    def sleep(self):  # slowly pauses tasks one by one, from least to most important,
        # then de-optimizes working memory and stores it into hard memory,
        # then saves it and clears the optimized (working) memory
        self.processor.nn.optimize()
        self.processor.nn.save(soft_save=True, hard_save=False)
        self.processor.nn.working_neurons = []

    def shutdown(self):  # runs sleep, then clears both hard and working memory after saving, then shuts the AI down completely.
        # useful for debugging and going through the "AI's brain" data files
        self.sleep()
        self.processor.nn.save(soft_save=False, hard_save=True)
        self.processor.nn.hard_memory_neurons = []
        quit()

    def kill(self):  # Crashes the AI; a good kill switch for a rogue AI
        quit()


#############################################
#                USAGE DEMO                 #
#############################################
# Note: run the following in a different thread, while keeping access to sl_ai

# Create the SmartLearning AI system
sl_ai = SmartLearning()
# Initialize subsystems
sl_ai.init()

# Load the world information stream (o_stream)
# and the AI actor/agent control stream (i_stream)
# (StreamObject is a placeholder; a minimal stream sketch is given at the end of this demo)
sl_ai.loadStreams(i_stream=StreamObject, o_stream=StreamObject)

# Start up the AI for the first time (booting)
sl_ai.boot()

# every 3 to 9 hours do a
# sl_ai.sleep()
# sl_ai.wake()
# (see the maintenance-loop sketch below)

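# A minimal illustrative sketch (not part of the original code): a maintenance loop that sleeps and
# wakes the AI on a fixed interval, as the comment above suggests. The function name and the 6-hour
# default (within the suggested 3-to-9-hour range) are assumptions.
def maintenance_loop(ai, interval_hours=6):
    import time
    while True:
        time.sleep(interval_hours * 3600)  # wait for the chosen interval
        ai.sleep()                         # de-optimize and persist working memory
        ai.wake()                          # reload working memory and continue
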
# The AI should be functional now. It will take roughly 9 to 48 human months to learn how to speak and walk.
# After roughly 20 years it will be able to program a version of itself.
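
# A minimal illustrative sketch (not part of the original code) of a stream object exposing the
# interface that Input.notifier and Output.sender expect: canFetch()/fetch()/pop() on the input side
# and canPush()/push() on the output side. The class name and the in-memory list are assumptions;
# a real stream would wrap a sensor, socket, file, etc.
class ListStream:
    def __init__(self):
        self.items = []

    # input-side interface
    def canFetch(self):
        return len(self.items) > 0

    def fetch(self):
        return list(self.items)  # hand back a snapshot of the pending items

    def pop(self, item):
        if item in self.items:
            self.items.remove(item)

    # output-side interface
    def canPush(self):
        return True

    def push(self, item):
        self.items.append(item)

# e.g. sl_ai.loadStreams(i_stream=ListStream(), o_stream=ListStream())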