collinsanele

Web Scraping For Intermediate/Advanced Coders

Jul 1st, 2019
  1. WEB SCRAPING PROJECTS
  2.  
  3. PROJECT 1
  4. I'm looking for someone who specializes in building email lists. Specifically, we would like to build a list of Australian architects, design architects and interior designers/planners located in the states of SA and ACT, Australia. This is one website where you can find them: [login to view URL] You can also use Yellow Pages. Anything else you find is welcome too. One company may have several architects. ...
  5. ------------------------------------------------------------PROJECT 2
  6. The job is to manually get data from a website and enter it in a Google Sheet document, according to this spec:
  7.  
  8. [login to view URL]
  9.  
  10. Please share a price quote and a time estimation in your offer.
  11. Also, the job requires a 5-minute update meeting with me every day.
  12. ------------------------------------------------------------PROJECT 3
  13. I'm looking for a data entry person to source email addresses from prescribed directories for the Australian Marine and Maritime industries.
  14.  
  15. I am seeking 1000 contacts to begin with - it's a straightforward email list requiring accuracy for location, type of business, name of business and email address.
  16.  
  17. Please price for 1000 contacts - the research is easy and it's a simple few ...
  18. ------------------------------------------------------------PROJECT 4
  19. I am looking for a XAMPP expert. You need to have experience in SSL certificate installation.
  20. ------------------------------------------------------------PROJECT 5
  21. Small CRM application that consists of about 5 tables and fewer than 10 screens for data input.
  22. ------------------------------------------------------------PROJECT 6
  23. Hello.
  24.  
  25. I have a list of many Telegram users in a CSV file.
  26.
  27. I would like to add them to a Telegram group,
  28. and I would like to post an LP on the website.
  29.
  30.
  31. If you can add a group of users to a Telegram group at once,
  32. or if you have that kind of software, please let me know.
  33.
  34. I think the due date is about 3 days from today.
  35.  
  36. Thank you.
  37. ------------------------------------------------------------PROJECT 7
  38. It can be made using Python along with PHP (for the WordPress part). The functionality will be:
  39.  
  40. 1. Scraping every kind of app from Google Play Store (using a keyword or category-wise) to an Excel format.
  41. 2. Auto-publishing the app data (description, image) to WordPress (as a post) from the Play Store API, or manually inputting the Excel data from Step 1 to create a WordPress post.
  42. 3. Google Play Store...
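A hedged sketch of step 2 only: assuming the target WordPress site has the REST API enabled and an application password configured for the posting account, one scraped row could be published roughly like this (the site, user, and password below are placeholders, not details from the brief):

import requests

WP_SITE = "https://example.com"                  # placeholder WordPress site
WP_USER = "api_user"                             # placeholder account
WP_APP_PASSWORD = "xxxx xxxx xxxx xxxx"          # placeholder application password

def publish_app_post(title, description, image_url):
    """Create one WordPress post from a scraped app's data via the REST API."""
    payload = {
        "title": title,
        "content": f'<img src="{image_url}">\n{description}',
        "status": "publish",
    }
    resp = requests.post(
        f"{WP_SITE}/wp-json/wp/v2/posts",
        json=payload,
        auth=(WP_USER, WP_APP_PASSWORD),
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["id"]

# Example: publish one row that was scraped into the Excel sheet in step 1.
print(publish_app_post("Example App", "Scraped description...", "https://example.com/icon.png"))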
  43. ------------------------------------------------------------PROJECT 8
  44. I'm a Ph.D. scholar at IITR. I'm trying to build up a server and for that, I require some amount of data. So, I have already generated a good amount of data, but some more entries need to be added. Furthermore, I need a person who is skilled enough to scrape the PDF file.
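For the PDF-scraping part, a minimal sketch using pdfplumber; the file name, and the assumption that the entries live in tables, are mine rather than the poster's:

import csv
import pdfplumber

rows = []
with pdfplumber.open("source.pdf") as pdf:            # placeholder file name
    for page in pdf.pages:
        for table in page.extract_tables():           # each table is a list of rows
            rows.extend(table)

with open("entries.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)
print(f"extracted {len(rows)} rows")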
  45. ------------------------------------------------------------PROJECT 9
  46. I have a database of more than 1.3 million leads from Egypt. Contact details available: name, address, email, phone number, home phone.
  47. 75% of contacts are property owners.
  48. Property Owners
  49. Bank account holders
  50. Car owners, categorized by car manufacturer
  51. Many more categories available
  52.  
  53. I am looking for someone who wants to buy this database or help me to find a buyer.
  54. This is a gold mine for Re...
  55. ------------------------------------------------------------PROJECT 10
  56. This script should scrape usernames from any reference group, and it will be responsible for adding a bunch of usernames to Telegram groups automatically.
  57. ------------------------------------------------------------PROJECT 11
  58. Hello :) Thank you for having interest in this topic.
  59.  
  60. **
  61. PLEASE, PLEASE READ THE PROJECT DESCRIPTION FIRST,
  62. AND SEND THE PROPOSAL AFTER READING IT.
  63. IF YOU SEND A BOT-PASTED PROPOSAL, I WILL INSTANTLY DUMP IT INTO THE TRASH CAN.
  64. PLEASE DON'T WASTE BOTH OF OUR PRECIOUS TIME.
  65. **
  66.  
  67. I'm trying to obtain data for previously traded charts from TradingView.
  68.  
  69. When it means scraping ...
  70. ------------------------------------------------------------PROJECT 12
  71. I have a [login to view URL] and it looks like [login to view URL] I want to create a wonderful catalog and upload my products to my website.
  72. If you are interested in my project, please bid.
  73. Thanks
  74. ------------------------------------------------------------PROJECT 13
  75. Web researcher to do a lead generation task, which includes creating a spreadsheet and listing company information.
  76. ------------------------------------------------------------PROJECT 14
  77. I have a list of products that I want pulled from other websites and compiled into a CSV file. Please make sure it's the right CSV format that Magento will accept, and please upload it to our Magento 2 store and get it done correctly.
  78.  
  79. I need a real Magento expert and a person who is experienced in accurate web scraping and Magento CSV file creation. I need this one to get it d...
  80. ------------------------------------------------------------PROJECT 15
  81. I have a list of items that need additional information. I have a website that provides the set information. Need to combine information. Website is HTML and I have list of pages already. Very simple, no need for massive/exotic/specialty work.
  82. ------------------------------------------------------------PROJECT 16
  83. Python Web Scraper Specialist
  84. Expert knowledge of Scrapy, Beautiful Soup, lxml, Selenium, and proxy rotation libraries.
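A minimal sketch of the proxy-rotation piece using plain requests; the proxy URLs are placeholders, and in a real job this would usually be wired into Scrapy or Selenium instead:

import itertools
import requests

PROXIES = [
    "http://user:pass@proxy1.example.com:8000",       # placeholder endpoints
    "http://user:pass@proxy2.example.com:8000",
]
proxy_pool = itertools.cycle(PROXIES)

def fetch(url):
    """Fetch a URL through the next proxy in the rotation."""
    proxy = next(proxy_pool)
    return requests.get(
        url,
        proxies={"http": proxy, "https": proxy},
        headers={"User-Agent": "Mozilla/5.0"},
        timeout=20,
    )

print(fetch("https://httpbin.org/ip").json())          # shows which exit IP was used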
  85. ------------------------------------------------------------PROJECT 17
  86. Please scrape the list of emails. Message me for log-in credentials. - Once you get into the site, click on Advanced (near the top right). - Under the Advisor section, check Investment Bank, M&A Advisory Firm and Business Broker, and click Search Companies (blue button). Make sure people are selected and NOT companies. - There should be around 5500 companies with names listed on the right. We want...
  87. ------------------------------------------------------------PROJECT 18
  88. I need a simple python script that crawls a website and downloads a large number of PDF files. The PDF files are located on a number of different sub-pages that need to be crawled over.
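A minimal sketch of such a script, assuming the sub-pages and the PDFs are reachable through ordinary <a href> links from a start URL (which is a placeholder here):

import os
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

START_URL = "https://example.com/documents/"            # placeholder start page
OUT_DIR = "pdfs"
os.makedirs(OUT_DIR, exist_ok=True)

def links(url):
    """Return all absolute hrefs found on a page."""
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    return [urljoin(url, a["href"]) for a in soup.select("a[href]")]

for sub_page in links(START_URL):
    if not sub_page.startswith("http") or sub_page.lower().endswith(".pdf"):
        continue                                        # only follow ordinary sub-pages
    for link in links(sub_page):
        if link.lower().endswith(".pdf"):
            name = os.path.join(OUT_DIR, link.rsplit("/", 1)[-1])
            with open(name, "wb") as f:
                f.write(requests.get(link, timeout=60).content)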
  89. ------------------------------------------------------------PROJECT 19
  90. Hi,
  91.  
  92. I want the 2010 - 2019 results from the following site scraped and stored in an Excel file;
  93. [login to view URL]
  94.  
  95. I want it in this format in the file;
  96. Date, Time, HomeTeam, AwayTeam, Result, Odds1, Odds2, Odds3 (all of these are found on the main results page), and then I want the Correct Score information added (you find it by clicking the match and choosing CS (correct score)). Also highlight the...
  97. ------------------------------------------------------------PROJECT 20
  98. I'm looking for a solid email list from ZoomInfo.
  99. ------------------------------------------------------------PROJECT 21
  100. Hello
  101. I want to replicate a website: I want to scrape the website and download all content. The pictures will have to be cropped, as there is a label on them.
  102. Please bid only if you have 5 stars and 100+ jobs.
  103. ------------------------------------------------------------PROJECT 22
  104. I need someone to make the [login to view URL] script work on my site. This includes downloading and configuring it.
  105. ------------------------------------------------------------PROJECT 23
  106. Hello
  107.  
  108. Here are the instructions to complete the task; I am attaching an image as well that can help you.
  109.  
  110. Instructions
  111.  
  112. First, you must enter this site:
  113.  
  114. [login to view URL]
  115.  
  116. On this page, please download the folders starting from January 2014 until May 2019. This will require changing the month and the date accordingly. For the months before January 2016, you will have to edit the li...
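Since the real folder URLs sit behind [login to view URL], here is only a hedged sketch of the month-by-month loop from January 2014 to May 2019, with a completely hypothetical URL pattern:

import requests

BASE = "https://example.com/archive/{year}-{month:02d}.zip"   # hypothetical pattern

year, month = 2014, 1
while (year, month) <= (2019, 5):                              # January 2014 .. May 2019
    url = BASE.format(year=year, month=month)
    resp = requests.get(url, timeout=60)
    if resp.ok:
        with open(f"{year}-{month:02d}.zip", "wb") as f:
            f.write(resp.content)
    month += 1
    if month == 13:
        year, month = year + 1, 1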
  117. ------------------------------------------------------------PROJECT 24
  118. Fast, serious Google researcher needed. [login to view URL] No time wasters, please. Only serious people and full-time workers need apply. JOB: search for contact details of given US lawyers; if you can do this, apply.
  119. ------------------------------------------------------------PROJECT 25
  120. I need to develop a web app that can approve members to a group automatically. My website is [login to view URL] and I have many LinkedIn groups, each of which contains a lot of pending group members. I need to develop a script or piece of software that can automatically approve group members when they fill in a form. The software needs to approve members only when they fill in that form. So the software also...
  121. ------------------------------------------------------------PROJECT 26
  122. I have started this project with Selenium, ChromeDriver, and Python, but I've run into a number of issues that go over my head. When I open a URL in, say, Google Chrome, the flights page loads an initial set of deals and then keeps updating with cheaper prices. For whatever reason, my code is not able to even load the latest results. My [logi...
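For problems like this, the usual fix is an explicit wait so the driver reads the DOM only after the dynamically loaded results are present; the URL and CSS selector below are placeholders, not taken from the poster's code:

from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.support.ui import WebDriverWait

driver = webdriver.Chrome()
driver.get("https://example.com/flights?from=A&to=B")          # placeholder URL

# Block until at least one result row exists; the page can keep refining
# prices afterwards, so a longer timeout (or a second wait on a "done"
# indicator) may be needed before reading the final values.
rows = WebDriverWait(driver, 30).until(
    EC.presence_of_all_elements_located((By.CSS_SELECTOR, ".result-row"))  # placeholder selector
)
for row in rows:
    print(row.text)
driver.quit()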
  123. ------------------------------------------------------------PROJECT 27
  124. I need to develop a web app that can approve members to a group automatically.
  125. ------------------------------------------------------------PROJECT 28
  126. I'm looking for someone that can write a script that can be used to create Data Center proxies for various regions.
  127. ------------------------------------------------------------PROJECT 29
  128. I have roll numbers of around 6000 candidates.
  129. There is a website where each roll number needs to be entered along with the candidate's date of birth.
  130. The dates of birth fall within a range of 5 years.
  131. I need the rank and marks for all roll numbers in an Excel file, arranged in descending order.
  132. Thanks.
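A hedged sketch of how this could be automated, assuming the results site accepts a simple POST; the URL, form field names, success pattern, and roll-number/DOB ranges are all hypothetical, and brute-forcing every date in a 5-year window is only workable if the request volume is acceptable to the site:

import re
from datetime import date, timedelta

import pandas as pd
import requests

RESULT_URL = "https://example.com/results"             # placeholder
ROLL_NUMBERS = range(100001, 106001)                    # ~6000 roll numbers (example)

def dob_candidates(start=date(1998, 1, 1), years=5):
    """Yield every date in the assumed 5-year window."""
    d = start
    while d < date(start.year + years, start.month, start.day):
        yield d
        d += timedelta(days=1)

records = []
for roll in ROLL_NUMBERS:
    for dob in dob_candidates():
        resp = requests.post(RESULT_URL, data={"roll": roll, "dob": dob.isoformat()}, timeout=20)
        m = re.search(r"Rank[:\s]*(\d+).*?Marks[:\s]*(\d+)", resp.text, re.S)  # hypothetical page layout
        if m:
            records.append({"roll": roll, "rank": int(m.group(1)), "marks": int(m.group(2))})
            break

pd.DataFrame(records).sort_values("marks", ascending=False).to_excel("results.xlsx", index=False)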
  133. ------------------------------------------------------------PROJECT 30
  134. An Excel spreadsheet with details of the travel I have done since February.
  135.  
  136. I will go through my inbox and forward you all the airline, coach etc bookings that I have made since February. You will go through these emails and put the basic details (date of travel, start, destination, cost) into a spreadsheet and send that back to me. There will be fewer than 100 emails.
  137. ------------------------------------------------------------PROJECT 31
  138. Hello
  139.  
  140. There are around 1200 listings in the links below:
  141.  
  142. [login to view URL]
  143.  
  144. We need to capture all this data in an Excel file with these columns:
  145.  
  146. Name
  147. Address
  148. Phone
  149. Email
  150. Website
  151.  
  152. Thank you
  153. ------------------------------------------------------------PROJECT 32
  154. Scrape details from Amazon products,
  155. like price, description, quantity, features, etc.
  156. Bonus:
  157. [login to view URL] a teacher and show me how it's done too.
  158. 2. If you can grant API access to Amazon
  159. ------------------------------------------------------------PROJECT 33
  160. I need good web scraper developers, preferably teams or agencies, but individuals are welcome to apply. Please feel free to bid.
  161. ------------------------------------------------------------PROJECT 34
  162. Hi,
  163.  
  164. I need a person who can provide me a tool which reads the given URLs and provides me data with
  165. URL, COMPANY, WEBSITE, E-MAIL and PHONE NUMBER.
  166. I will share the URLs with the right candidate.
  167. This is a 1 or 2 hour job for the right candidate.
  168. My budget is $10 to $20.
  169.  
  170. Note: All info is present on the given URLs.
  171.  
  172. Thanks
  173. Sumit
  174. SaS Technologies
  175. ------------------------------------------------------------PROJECT 35
  176. I want to extract data through Node.js, and the tasks are below:
  177. * Find all links from the Google search results (e.g. if I search for a website on Google, all the returned Google links should be collected).
  178. * Find all emails, phone numbers, and addresses on each web page.
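The brief asks for Node.js, but the extraction logic of the second task translates directly; a minimal Python sketch (the URL is a placeholder and the phone pattern is a rough assumption, not a full international parser):

import re
import requests

EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")
PHONE_RE = re.compile(r"\+?\d[\d\s().-]{7,}\d")         # loose phone-number guess

html = requests.get("https://example.com/contact", timeout=30).text
print("emails:", sorted(set(EMAIL_RE.findall(html))))
print("phones:", sorted(set(PHONE_RE.findall(html))))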
  179. ------------------------------------------------------------PROJECT 36
  180. I need good web scraper developers, preferably teams or agencies, but individuals are welcome to apply. Please feel free to bid.
  181. ------------------------------------------------------------PROJECT 37
  182. * Topic: Multiprocessing/Multithreading scrapy application with proxy
  183.  
  184. * Expectation: Please only bid if you have project experience in this domain. No students, PLEASE. Expert-level (300-400) conversation / consultation / demo for 2 hours. Deliverables: architecture diagram and component demo code for proof of concept (not production-ready code).
  185.  
  186. * Project background: we have millions of tasks s...
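A hedged sketch of the proxy half of such an architecture: a Scrapy downloader middleware that attaches a random proxy to each request, plus the concurrency settings that usually stand in for "multithreading" since Scrapy already parallelises requests itself. The proxy URLs and the middleware module path are placeholders.

import random

PROXIES = [
    "http://user:pass@proxy1.example.com:8000",         # placeholder endpoints
    "http://user:pass@proxy2.example.com:8000",
]

class RandomProxyMiddleware:
    """Downloader middleware: pick a proxy per request."""
    def process_request(self, request, spider):
        request.meta["proxy"] = random.choice(PROXIES)

# settings.py (excerpt)
DOWNLOADER_MIDDLEWARES = {
    "myproject.middlewares.RandomProxyMiddleware": 350,  # placeholder module path
}
CONCURRENT_REQUESTS = 64
CONCURRENT_REQUESTS_PER_DOMAIN = 8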
  187. ------------------------------------------------------------PROJECT 38
  188. I need you to develop some software for me. I would like this software to be developed for Windows using Python.
  189. ------------------------------------------------------------PROJECT 39
  190. Hi, we need a person who collects information from websites and inputs it into Excel. We provide the website list. You need to collect company name, business address, email and phone number.
  191. ------------------------------------------------------------PROJECT 40
  192. We are in the process of growing our recipe database in My Online Gym. We are looking for someone to source healthy recipes and then do the data entry to put them in the database. We would also expect the person to provide high-quality images for each recipe. These images must not be taken inappropriately from others' work.
  193. ------------------------------------------------------------PROJECT 41
  194. I have a Chrome extension that can automatically follow people and scrape followers and following; however, I need it to have more features:
  195. 1. Unfollow Mutual Follower (People I follow who follow back my account)
  196. 2. Unfollow Nonfollowers (People I follow who don’t follow back my account)
  197. 3. Unfollow based on excel sheet (sheet 4)
  198. 4. Edit the existing feature to store user instagram descrip...
  199. ------------------------------------------------------------PROJECT 42
  200. Language: PHP, NO FRAMEWORKS
  201. Database: mysql
  202.  
  203. API used: [login to view URL]
  204.  
  205. Table A (named rfpsites_mainlist)
  206. name (varchar)
  207. state (varchar): 2-letter US state code
  208. type (varchar): city, county, school district, university, college
  209.
  210. Table B (named serpurl)
  211. id (int, auto increment)
  212. name (varchar)
  213. url
  214. state (varchar): 2-letter US state code
  215. type (varchar): city, county, school district, university, college
  216. c...
  217. ------------------------------------------------------------PROJECT 43
  218. Creating a spreadsheet of contacts from an online public website:
  219.  
  220. - First name
  221. - Last name
  222. - Company
  223. - Address
  224. - Telephone
  225. - Email
  226.  
  227. Spreadsheet template and brief to be sent on hiring with url and search criteria.
  228.  
  229. 1,000s of records to be created.
  230.  
  231. Immediate start. May be other database-related work available.
  232. ------------------------------------------------------------PROJECT 44
  233. Hello,
  234.  
  235. We need a good data scraper to get us the emails of all the Australia-bound tours and travelers from the websites listed below.
  236.  
  237. Please note that these sites don't display the emails/phones directly, so you need to be a code expert who can dig into the page source, or use any other way to get us the emails of the listings.
  238.  
  239. As is often the case, the same business is listed multiple times...
  240. ------------------------------------------------------------PROJECT 45
  241. We would like to gather the contact information of eBay sellers on the UK site and then compare it every week to see if they are still sellers; if they are not, we would like those details to be highlighted.
  242.  
  243. This would need to be a cloud-based system which will automatically search for eBay sellers that have been banned or removed from their system; if they have, we would like the email/contact info...
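A minimal sketch of just the weekly comparison step, assuming each week's scrape is saved as a CSV with a "seller" column (file and column names are placeholders):

import pandas as pd

last_week = pd.read_csv("sellers_last_week.csv")
this_week = pd.read_csv("sellers_this_week.csv")

# Sellers present last week but missing now are the ones to highlight.
gone = last_week[~last_week["seller"].isin(this_week["seller"])]
gone.to_csv("removed_sellers.csv", index=False)
print(f"{len(gone)} sellers no longer listed")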
  244. ------------------------------------------------------------PROJECT 46
  245. I need product data scraped or crawled from a price comparison website, with the price and stock availability updated every 24 hours. The data is big, maybe more than 500k products. For this I need someone who can build visual cloud automation for us in Python, or we can hire someone who can do that for us daily on their own server.
  246. ------------------------------------------------------------PROJECT 47
  247. I have a plugin that I have used that uses XPath to pull prices via a cronjob in specified intervals. The plugin still pulls prices from some stores but not others. I am looking for someone to fix this issue for me. I would also like assistance in getting the proper XPath such as (//*[@itemprop="price"]) from several Shopify stores I am scraping pricing updates from. I am attaching the p...
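For debugging an XPath expression like //*[@itemprop="price"] outside the cronjob, a minimal lxml sketch can help; the store URL is a placeholder, and whether the price sits in a content attribute or in element text depends on each store's markup:

import requests
from lxml import html

page = html.fromstring(
    requests.get("https://example-store.myshopify.com/products/item", timeout=30).content
)
prices = page.xpath('//*[@itemprop="price"]/@content | //*[@itemprop="price"]/text()')
print(prices)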
  248. ------------------------------------------------------------PROJECT 48
  249. I want someone to build a user-friendly price aggregation website which will crawl 10 major marketplaces.
  250. ------------------------------------------------------------PROJECT 49
  251. If anyone wants a sports betting odds comparison website, please contact me; I have the source code and data source feed already. Like [login to view URL]
  252. ------------------------------------------------------------PROJECT 50
  253. Good morning,
  254.  
  255. We are looking for someone who can scrape the addresses of the top 300 hotels in Germany and put them into an Excel sheet.
  256.  
  257. We need the name of the owner and the address (all shown on each website under "Impressum", which is a legal requirement in Germany).
  258.  
  259. Please make a proposal for it.
  260.  
  261. You would go through these hotels and scrape the details from their websites:
  262. [login to view URL]...
  263. ------------------------------------------------------------Source: https://www.freelancer.in/jobs/web-scraping/