Untitled
Dec 15th, 2019
1-1. What type does foo() return?

foo takes a function that takes an I and returns an IResult<I, O, E>. foo returns a similar function in which O became Option<O>: that is, foo returns something of type Fn(I) -> IResult<I, Option<O>, E>.

1-2. What is the type of bar?

bar is of type F, i.e. Fn(I) -> IResult<I, O, E>.

1-3. What is the return type of the following expression:

foo(bar)(i) is of type IResult<I, Option<O>, E>.

1-4. What is the type of j?

j is of type Option<O>.

2.
Y = (λx.λy.y (x x y)) (λx.λy.y (x x y))
Yz = (λx.λy.y (x x y)) (λx.λy.y (x x y)) z

Reducing step by step:
(λx.λy.y (x x y)) (λx.λy.y (x x y)) z
(λy.y ((λx.λy.y (x x y)) (λx.λy.y (x x y)) y)) z
z ((λx.λy.y (x x y)) (λx.λy.y (x x y)) z)

Yz = z(Yz) = z(z(Yz)) = z(z(z(Yz))) = z(z(...z(Yz)...))

This looks like a spawner: it can wrap as many applications of z around itself as it wants, and it never terminates on its own. In other words, Y is a fixed-point combinator (this particular one appears to be Turing's Θ), the device that makes unbounded recursion expressible in the lambda calculus.


3. https://stackoverflow.com/questions/33923/what-is-tail-recursion
Compared to regular recursion, tail recursion is when the recursive call is the very last thing your function does, so you do the computation as you go, carrying the partial result along (usually in an accumulator argument). In regular recursion, the computation only happens after you have gone through all the recursive calls and started returning. I know Stack Overflow isn't completely reliable, but the examples and code presented there are in agreement with the community. As we can see below, regular recursion keeps expanding the recursion and does the operations all at the end, while tail recursion does the operation at each recursive step. The benefit of tail recursion is that the current stack frame is no longer needed once the recursive call is made, because the computation for that step is already done, so a compiler can reuse the frame (tail-call optimization) instead of growing the stack.

Recursive:
fn sum_slice(s: &[i32]) -> i32 {
    if s.is_empty() {
        0
    } else {
        s[0] + sum_slice(&s[1..])
    }
}

Tail recursive:
fn sum_slice(x: i32, s: &[i32]) -> i32 {
    if s.is_empty() {
        x
    } else {
        sum_slice(x + s[0], &s[1..])
    }
}

As you can see in both examples, say with s = [5, 4, 3, 2, 1]: the recursive one puts s[0] on hold and continues the recursion, so what you really end up with is something like this
5 + sum_slice([4, 3, 2, 1])
5 + 4 + sum_slice([3, 2, 1])
...
5 + 4 + 3 + 2 + sum_slice([1])
5 + 4 + 3 + 2 + 1

And in the tail recursion, the running total is passed along with you
(0, [5, 4, 3, 2, 1])
(5, [4, 3, 2, 1])
...
(14, [1])
(15, [])


4. Since I am kind of a math nerd, I'll select FORTRAN for this one.
After searching a little bit, I was able to find this:
https://gist.github.com/t-nissie/479f0f16966925fa29ea

This gist includes the implementation, the compilation (a Makefile), and a test file.

Looking at lines 6 - 10, I can tell that this is a recursive function, it's literally in the name ("recursive subroutine"), though I wasn't sure at first what a subroutine really is (it appears to be a procedure that returns nothing, like a void function). The parameters being passed in don't carry anything indicating their types, but going down further to the other lines tells us more. In the declarations we can see that indeed, you don't give parameters a type in the parameter list like you normally do; you just pass in the variables and then declare their types right after the subroutine header. (Quite interesting, never seen anything like this before.) "implicit none" seems to turn off FORTRAN's implicit typing rules, forcing every variable to be declared explicitly, and "real*8" appears to declare 8-byte (double-precision) reals, here used for the array being sorted. Everything else in the declarations seems self-explanatory (integer is an int).

From lines 12 - 29, I would say I kind of have an idea of what's going on, but without looking at any FORTRAN sources or other code, the part that really tripped me up is how they use the do while loop. Ignoring the quicksort functionality for now (everybody kind of knows what goes on during quicksort), in Java or C++ a do-while puts the body first and the loop condition after it. Here, FORTRAN's "do while (condition)" puts the condition right up front, so it appears to be a pre-test loop, like a plain while in C. In terms of quicksort, everything looks standard: setting the pivot, swapping, then calling the recursion; the last two statements are the recursion part, and if the index marking the end of the first partition is ever not less than the index marking the beginning of the second, the routine exits, the standard recursion stop condition. It's just the do-while syntax that was messing with me.

In the Makefile, I don't know what else I can say other than this looks very, VERY similar to a C/C++ Makefile, used to compile multiple object files and link them all together into an executable. Of course the language being compiled is vastly different from C/C++ from what I've done so far, but the format is almost identical.

In the test file, program/end program is probably the "main" method (with the same implicit none line forcing declarations), it calls quicksort, and prints the result out with the "write" statement. But yeah, I think that's about all the analysis I can do without googling anything more relating to FORTRAN.


5. In both Rust and Haskell, one of the simplest examples of lazy (short-circuit) evaluation is a compound condition in an if statement. In a condition such as if (A || B), the whole condition is true when either A or B is true, or both are, and then the body executes. This is lazy in the sense that the program first checks whether A is true. If A is true, that implies the whole condition is true regardless of what B really is, so the program skips figuring out what B is and evaluates the body of the if statement. Only if A is NOT true will condition B be evaluated, and that then determines whether the whole condition holds. Both languages evaluate || this way (and Haskell, being lazy throughout, extends the same idea to every expression).

One example of strict evaluation is pattern matching, in both Rust and Haskell. Let's say I define a function by cases, in Haskell:
foo True = 1
foo False = 1

This is definitely strict in some sense: both cases evaluate to the same exact thing, so it would have been easier to just write
foo _ = 1

but since I set the matching up that way, with True mapping to 1 and False mapping to 1, evaluating the function MUST force the argument and check whether it is True or False, regardless of the result being the same. Strict evaluation forces the program to evaluate the scrutinee in order to come to a conclusion (whereas foo _ = 1 never needs to look at its argument at all).
  115.  
  116. Looking at it from both sides, the lazy evaluation seems to have the speed edge, but the strict evaluation seems to have the security edge, both holding their pros and cons.
  117.  
  118.  
  119.  
6. product [x^2 | x<-[1..100]]

If I understand list-comprehension (set-builder) notation correctly, what is happening is a new list is created, and as x iterates from 1 to 100, the numbers are squared and stored in the list. So what you would have now is [1, 4, 9, ..., 10000]. After that is computed, what finally happens is you take the product of all the numbers in your new list, i.e. 1*4*9*16*...*10000.

In Rust,

fn main() {
    let x: i32 = (1..=100).map(|n| n * n).product();
    print!("{}", x);
}

This program compiles, and it runs if I swap .product() with .sum(), so .product() should also work in principle. The reason I got an error when I ran it is that the number this program produces is way too massive. So YES, there is a HUGE problem with this program: the result overflows i32 (in fact it overflows u128 too, since the answer has over 300 digits), and a debug build panics with "attempt to multiply with overflow". You need an arbitrary-precision integer library (e.g. the num-bigint crate) to manage that number.

In GHCi, it spits out the answer easily. To be exact:
8709782489089480079416590161944485865569720643940840134215932536243379996346583325877967096332754920644690380762219607476364289411435920190573960677507881394607489905331729758013432992987184764607375889434313483382966801515156280854162691766195737493173453603519594496000000000000000000000000000000000000000000000000

Haskell succeeds in generating these incredibly large numbers because Haskell's default Integer type is arbitrarily sized: it will use up as much memory as it needs in order to do its computations. In arbitrary-sized integers, arithmetic is usually done over an array of fixed-size "digits", where you do your standard per-digit operation, and if a digit is filled to the max, you carry over like you would in elementary arithmetic. With 4-byte or 8-byte machine words as digits, the integer is effectively stored in a counting base of 2^32 or 2^64 (our normal counting system is base 10).


7. https://manishearth.github.io/blog/2015/05/17/the-problem-with-shared-mutability/

I was fortunate to find this source, which breaks the mutability rules and shared mutable state down into simple, understandable code to explain why shared mutability is incredibly dangerous. In general, the reason shared mutability is so unsafe is that there's the risk of one party modifying the data in a way that prevents the other from working properly. From the source above, this code is provided:

let mut x = Vec::new();
x.push(1); // Allowed
{
    let ptr = &x; // Create an immutable reference
    let y = ptr[0]; // Allowed, nobody can mutate
    let y = x[0]; // Similarly allowed
    x.push(1); // Not allowed (will not compile): as long as `ptr` is active,
               // `x` is frozen for mutation
}

And clearly, this will give you an issue: you defined ptr, an immutable reference to the vector x, and then attempted to modify x while ptr was still live. The compiler rejects this because the push could reallocate the vector's buffer, leaving ptr pointing at freed memory; as long as an immutable borrow exists, x is frozen for mutation. Thus, this program yields a compiler error.

There is another simple example I found looking around on the internet, known as the producer-consumer problem.
https://en.wikipedia.org/wiki/Producer%E2%80%93consumer_problem

According to Wikipedia, the producer-consumer problem is a problem where both parties share a buffer: the producer produces data, and the consumer consumes data. The producer can't produce too much data, otherwise the buffer gets full, and the consumer can't consume too much data, otherwise there'll be no data left to consume. In this problem there is a specific scenario where a "deadlock" can happen, via a lost wakeup; here are the steps for that to happen, according to the page:

The consumer has just read the variable itemCount, noticed it's zero, and is just about to move inside the if block.
Just before calling sleep, the consumer is interrupted and the producer is resumed.
The producer creates an item, puts it into the buffer, and increases itemCount.
Because the buffer was empty prior to the last addition, the producer tries to wake up the consumer.
Unfortunately, the consumer wasn't yet sleeping, and the wakeup call is lost. When the consumer resumes, it goes to sleep and will never be awakened again, because the consumer is only awakened by the producer when itemCount is equal to 1.
The producer will loop until the buffer is full, after which it will also go to sleep.

I didn't paste the code because otherwise it'd be way too long, but this is a very fascinating problem that describes how shared mutability can really mess up your program.

Lastly, there is something called a data race, which the Rustonomicon actually goes a little in depth about. Here is the source.
https://doc.rust-lang.org/nomicon/races.html

I don't think I will go too deep into it because the answer to this question is already long, but a data race (two unsynchronized threads accessing the same data, at least one of them writing) is another example of the dangers of shared mutable state, and one that Rust's ownership rules reject at compile time.


8.
slithers(snake).
swims(fish).
swims(shark).
crawl(lizard).
scales(snake).
scales(lizard).
scales(fish).
scales(shark).
flies(bug).

% eats(A,B) is true when A eats B
eats(shark, fish).
eats(lizard, bug).
eats(snake, lizard).
eats(fish, fish).
eats(bug, bug).

?- swims(X).
X = fish ;
X = shark.

?- scales(lizard).
true.

?- eats(X, fish).
X = shark ;
X = fish.

?- scales(X), eats(X, Y).
X = snake, Y = lizard ;
X = lizard, Y = bug ;
X = Y, Y = fish ;
X = shark, Y = fish.

reptile(X) :- scales(X), (slithers(X) ; crawl(X)).

fish(X) :- scales(X), swims(X), eats(X, fish).

insect(X) :- flies(X), ((reptile(Y), eats(Y, X)) ; eats(bug, X)).


9.

No Silver Bullet Summary
From a general standpoint, Brooks argues that in software development there is no "silver bullet", i.e., nothing that provides an immediate and extremely effective solution to a given problem or difficulty, especially one that is normally very complex or hard to resolve. In his essay "No Silver Bullet", Brooks first talks about the difficulties of building software, categorizing them as either "essential difficulties" or "accidental difficulties". In short, essential difficulties are inherent in the nature of software and can't be avoided, while accidental difficulties attend today's production of software but are not inherent, so they are difficulties we create for ourselves and can in principle remove. After talking about difficulties, Brooks breaks down the properties of modern software systems into 4 parts: complexity, conformity, changeability, and invisibility.

"Let us consider the inherent properties of this irreducible essence of modern software systems: complexity, conformity, changeability, and invisibility."

Complexity - scaling up software increases the number of elements and their interactions (the key is that this increase is non-linear)
Conformity - a programmer cannot have the mindset of "everything has some kind of unifying principle" like you see in mathematics or physics; software must conform to arbitrary human institutions and interfaces
Changeability - software will constantly undergo changes; users and hardware will force these changes
Invisibility - software can NOT be visualized, unlike buildings and architectural designs

So in short, these properties are a pain to work with: software gets complex incredibly fast, not everything shares some sort of unifying principle like in mathematics or physics, software is constantly changing, and software can't even be visualized. So what can we do, given that software falls under these 4 hard-to-work-with properties? Well, Brooks offers several kinds of attacks on the problem of software engineering.

Reuse - reusing existing software components (buy versus build) means that a majority of your cost is really just developmental, as in all you really have to do is add on to what already exists. The cost of building your own solution usually stands out more than the cost of buying an existing system.

Incremental development - the biggest temptation in building software is the urge to build it optimally on the first try. Instead of building optimal software up front, learn how to build a working one, then refine it from there.

Investing in software developers - take the time to nurture and cultivate your own team. I don't think he necessarily means only finding the best people, but finding the ones you can develop and invest in, in order for them to build better software.


Out of the Tar Pit Summary
From a general standpoint, Moseley and Marks argue that complexity is the root of all problems in software engineering, and that by separating accidental complexity from essential complexity, there are methods to build better software. When it comes to software engineering, they argue that a variety of factors cause complexity. Some of these include explicitly specifying control, adding state to what could be stateless procedures, the languages themselves, etc. In short, there are too many things, most of which we could control, that cause all this complexity.

Similar to "No Silver Bullet", Moseley and Marks describe accidental and essential complexity. In short, essential complexity is the minimum inherent in the user's problem itself, what must be dealt with for the program to function at all. In the ideal world, being concerned only with essential complexity is easy: there is no ambiguity, there is no accidental state. However, any attempt to interpret the problem in a physical way, like code, creates accidental complexity. Accidental complexity, as described above, is complexity we create for ourselves.

Moseley and Marks continue by describing why it might be reasonable to accept accidental complexity to some level (of course you need the essential complexity, that IS the task). The reasons include performance and ease of expression: performance is when accidental state and control are required for efficiency, and ease of expression is when accidental state can sometimes be the easiest way to express some piece of logic. The way to group these together is to imagine essential complexity as the core of the system, more like an "idea", with essential logic and accidental state and control supporting that core. Essential complexity is the problem, the "idea"; anything used to represent that idea through higher-level languages or code is accidental complexity.


Personally, I would say I have to side more with Moseley and Marks when it comes to software engineering. Although the two essays do share some points, Out of the Tar Pit seems more relevant than No Silver Bullet. I have to agree that complexity is the root of all our problems, because I believe that as software engineering progresses with time, bigger and better tools will be developed to help us solve some of the problems "No Silver Bullet" mentions. One of these is invisibility. I believe that in the future, someone will develop a high-level language or tool that allows us to see what's happening inside a piece of software. Yes, Brooks might be right that software is invisible, but I don't think it's so invisible that there is no way we can create our own way of visualizing what it really is.

No Silver Bullet:
"Software is invisible and unvisualizable"

When it comes to discussing accidental complexity and essential complexity, that seems like the proper way to frame software engineering. When it came to describing what they really are, Moseley and Marks describe accidental and essential complexity in a way that makes software engineering much more of an open problem than just a typical engineering problem, which I 100% believe is the case.

No Silver Bullet:
"The essence of a software entity is a construct of interlocking concepts: data sets, relationships among data items, algorithms, and invocations of functions"

Out of the Tar Pit:
"Note that the definition of essential is deliberately more strict than common usage. Specifically when we use the term essential we will mean strictly essential to the users' problem (as opposed to – perhaps – essential to some specific, implemented system, or even – essential to software in general)"

When both essays try to depict what essential complexity is, No Silver Bullet views it as a mechanism that connects things such as data sets, algorithms, and more, as explained in the quote. Moseley and Marks, however, take it one step further and depict essential complexity as a formless entity, an idea, that is truly the core of the problem. I don't quite agree when No Silver Bullet says it's a construct of connections, but I do agree when Tar Pit says it's the core of your problem, because the essential complexity IS your problem. It may be imprecise to reason by analogy to other subjects, but think of a word problem being reinterpreted as a math equation. In this case, the essential complexity would be the math equation itself; anything in the word problem outside of that equation means absolutely nothing. "Jim buys 40 watermelons, and eats 20 of them, how many do we have left?" boils down to a very simple mathematical equation. The analogy is imperfect, though, because in software engineering you can't really express the essential complexity directly: any means of expressing it adds accidental complexity on top.

When it comes to accidental complexity, I have to again agree with Moseley and Marks. I believe Out of the Tar Pit depicts accidental complexity a lot better than No Silver Bullet does, but this is partly because their essay revolves around complexity in the first place, which I believe is the root of software engineering's problems. Brooks doesn't talk as much about accidental complexity, but one thing I probably agree with him on is that some tools, such as higher-level languages, can help us get rid of certain accidental complexities.

No Silver Bullet:
"The power of such systems does not come from ever-fancier inference mechanisms, but rather from ever-richer knowledge bases that reflect the real world more accurately. I believe that the most important advance offered by the technology is the separation of the application complexity from the program itself"

When it comes to tackling these software engineering problems, I must also say that Moseley and Marks make a stronger point than Brooks. Brooks does take a fair stand on the problems we have with software engineering, but I believe Moseley and Marks break it down to the core, in a way that changes how you see the problem and how you solve it.

Out of the Tar Pit:
"It is our belief that the vast majority of state (as encountered in typical contemporary systems) simply isn't needed (in this ideal world). Because of this, and the huge complexity which state can cause, the ideal world removes all non-essential state"

When you strip a problem of all its accidental complexity (expression through code, higher-level languages, anything that has nothing to do with the problem) and you look only at the problem itself, that's when you reach true essential complexity; in an ideal world, the essential complexity is just the problem itself. In an ideal world, most of the machinery used to express this problem wouldn't be needed. In short, it's just an idea, floating free.

On top of that, I really like how Moseley and Marks logically define accidental complexity only after pinning down essential complexity.

Out of the Tar Pit:
"We believe that – despite the existence of required accidental complexity – it is possible to retain most of the simplicity of the ideal world in the real one. We now look at how this might be achievable. Our recommendations for dealing with complexity (as exemplified by both state and control) can be summed up as avoid, and separate."

In short, the things that really can't be avoided, which I agree to be true, are performance and ease of expression. These things usually take the form of higher-level languages, logic expression, and so on. As much as essential complexity is the problem itself, we can choose to represent this problem in a manner readable to us humans, through these accidental complexities.


10.
I'm going to frame my vision in the context of PL as communication, i.e. the divide between those who know how to code and those who don't, and what impact this may have on those who do NOT code, because in this day and age, not knowing how to code is a huge disadvantage.

I feel like at this point, a lot of the things we build are either completely new, or for ease of use and access. For example, look at a lot of our technology today. When computers were first introduced, a computer was a literal brick that had a terminal, a place to store research (if I remember correctly). Now, in the present, a computer can do almost whatever a person wants it to do, technology-wise. Similarly, in the context of programming languages, we've had languages like C, where a beginner would probably struggle to figure out what is really going on, compared to recent languages such as Rust. It seems clear that the direction programming languages are heading, in the context of communication, is toward readability, simplicity, and ease of use. I mean, take a look at Dark for example, a very new language getting attention for web development. Look at how easy it looks to create some sort of website through blocks; it's almost like an art, where the difficulty of picking it up is probably a LOT easier than JavaScript. Additionally, look at the programming language our professor created, Mech. There are very nice-looking visualizations, tables, diagrams, etc. It seems like the programming languages being developed in the next 10 years are aimed at ease of use and access for everyone, most likely making programming an incredibly easy thing to pick up, and a basic skill in the technological era. Researchers and everyday people will create these new languages, and they will succeed in creating them, making it easier for everyone to pick coding up. I think everybody is mildly frustrated, including me, at how old languages are sometimes incredibly cryptic, and at how you have to manage everything on your own: pointers, safety checks, compiler errors, and stuff like that. The shift from old languages to new languages is most definitely going to be what I like to call a convenience shift. As of right now, programmers worldwide seem to be in a state of transition: a lot of us are still coding in old languages like C/C++ while looking to learn new languages and realize the potential they could have when developing programs. In short, I believe languages are not going to develop much in the functionality part, but more in the interface part, making code a lot easier to pick up, a lot safer to write, and a lot simpler to program.


Citations

https://stackoverflow.com/questions/33923/what-is-tail-recursion

https://gist.github.com/t-nissie/479f0f16966925fa29ea

https://manishearth.github.io/blog/2015/05/17/the-problem-with-shared-mutability/

https://en.wikipedia.org/wiki/Producer%E2%80%93consumer_problem

https://doc.rust-lang.org/nomicon/races.html

https://blog.acolyer.org/2015/03/20/out-of-the-tar-pit/