Untitled

a guest
Jun 24th, 2017
=begin
This is an attempt to write a spider.
Start off by requiring the gems, then fetch an HTML page
by connecting to a server and using Nokogiri to parse the HTML
into a variable.
=end

require 'nokogiri'
require 'open-uri'

=begin
Create a method that reads a link from the user and reports how many
links are present at the given URL.
=end

def how_many_links
  # local links array
  links = []
  # get the URL to crawl from the user
  link = gets.chomp
  # fetch the page with open-uri and parse the HTML with Nokogiri
  # (URI.open is required on Ruby 3+, where Kernel#open no longer
  # opens URLs)
  doc = Nokogiri::HTML(URI.open(link))
  # collect the text of every anchor tag into the local array
  doc.css("a").each do |l|
    links.push(l.content)
  end
  # print the outcome
  puts "The spider found #{links.count} links"
end