{{Team Member|Has team sponsor=McNair Center|Went to school=Rice|Has team position=Student|Has job title=Tech Team|Has name=Peter Jalbert|Has headshot=peter_headshot.jpg|Has or doing degree=BA|Has academic major=Computer Science; Music Performance}}
==Early Life==
Peter was born in Philadelphia, PA to Julia and Pierre Jalbert in the middle of his father's interview for a position at Rice University. Days after, the Jalbert family moved to Houston, TX, where they still reside to this day.

==Education==
Peter is currently a junior at Rice University, pursuing a double major in Computer Science and Music. Peter graduated Salutatorian from the High School for the Performing and Visual Arts in Houston, TX in 2014.

==Looking for Code?==
* [http://mcnair.bakerinstitute.org/wiki/Google_Scholar_Crawler Google Scholar Crawler]

===Demo Day Crawler===
Term: Fall 2017
Crawls Google search to find candidate web pages for accelerator companies' demo days.
 E:\McNair\Software\Accelerators\DemoDayCrawler.py

===Demo Day Hits===
Term: Fall 2017
Analyzes the results of a demo day crawl for hits of keywords.
 E:\McNair\Software\Accelerators\DemoDayHits.py

===HTML to Text===
Term: Fall 2017
Converts a folder of HTML files to a folder of TXT files.
 E:\McNair\Software\Accelerators\htmlToText.py

===Tiger Geocoder===
Term: Fall 2017
Installed a psql extension that allows for internal geocoding of addresses.
 psql geocoder

===Yelp Crawler===
Term: Fall 2017
Usage: Crawls for data on restaurants and coffeeshops within the 610 Loop. Part of the Houston Innovation District Project.
 E:\McNair\Software\YelpCrawler\yelp_crawl.py

===Accelerator Founders===
Term: Fall 2017
Usage: Uses the LinkedIn Crawler along with the Crunchbase founders data to retrieve information on accelerator founders.
 E:\McNair\Projects\LinkedIn Crawler\LinkedIn_Crawler\linkedin\linkedin_founders.py

===Crunchbase Founders===
Term: Fall 2017
Usage: Queries the Crunchbase API to get names of accelerator founders.
 E:\McNair\Projects\Accelerators\crunchbase_founders.py

===LinkedIn Crawler===
Term: Spring 2017
Usage: Crawls LinkedIn to obtain relevant information.
 E:\McNair\Projects\LinkedIn Crawler\web_crawler\linkedin\run_linkedin_recruiter.py

===Draw Enclosing Circles===
Term: Spring 2017
Usage: Draws the outcome of the Enclosing Circle Algorithm on a particular city to a Google Maps HTML output.
 E:\McNair\Projects\Accelerators\Enclosing_Circle\draw_vc_circles.py

===Enclosing Circle for VCs===
Term: Spring 2017
Usage: Uses the Enclosing Circle algorithm to find concentrations of VCs.
 E:\McNair\Projects\Accelerators\Enclosing_Circle\vc_circles.py
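The project's Enclosing Circle Algorithm is not reproduced here; as a self-contained illustration of the geometric primitive behind finding spatial concentrations of points, the sketch below computes a smallest enclosing circle the naive way. This is the classic textbook primitive, not the code in vc_circles.py, and all function names are mine.

```python
from itertools import combinations
from math import dist, hypot

def circle_from_two(a, b):
    # Circle with the segment ab as diameter.
    center = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)
    return center, dist(a, b) / 2

def circle_from_three(a, b, c):
    # Circumcircle of triangle abc; returns None for collinear points.
    ax, ay = a; bx, by = b; cx, cy = c
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if d == 0:
        return None
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return (ux, uy), dist((ux, uy), a)

def covers(circle, points, eps=1e-9):
    (cx, cy), r = circle
    return all(hypot(px - cx, py - cy) <= r + eps for px, py in points)

def smallest_enclosing_circle(points):
    # Naive search (fine for small inputs): the minimal circle is
    # determined by either two or three of the input points, so try
    # every pair and triple and keep the smallest circle that covers
    # everything. Assumes at least two points.
    best = None
    for pair in combinations(points, 2):
        c = circle_from_two(*pair)
        if covers(c, points) and (best is None or c[1] < best[1]):
            best = c
    for triple in combinations(points, 3):
        c = circle_from_three(*triple)
        if c and covers(c, points) and (best is None or c[1] < best[1]):
            best = c
    return best
```

Faster variants exist (Welzl's expected-linear-time algorithm), but the brute-force version makes the definition explicit.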
===Industry Classifier===
Term: Spring 2017
Usage: Neural net that predicts a company's industry classification.
 E:\McNair\Projects\Accelerators\Code+Final_Data\ChristyCode\IndustryClassifier.py

===WayBack Machine Parser===
Term: Spring 2017
Usage: Uses the WayBack Machine API to retrieve timestamps for URLs.
 E:\McNair\Projects\Accelerators\Spring 2017\Code+Final_Data\wayback_machine.py

===Accelerator Address Geolocation===
Term: Spring 2017
Usage: Used to find latitude and longitude points for all accelerators in the accelerator data files.
 E:\McNair\Projects\Accelerators\Code+Final_Data\process_locations.py

===Accelerator Data Parser===
Term: Spring 2017
Usage: Used to parse the data for the [http://mcnair.bakerinstitute.org/wiki/Accelerator_Seed_List_(Data) Accelerator Seed List Project].
 E:\McNair\Projects\Accelerators\Code+Final_Data\parse_accelerator_data.py

===Cohort Data Parser===
Term: Spring 2017
Usage: Used to parse cohort data for the [http://mcnair.bakerinstitute.org/wiki/Accelerator_Seed_List_(Data) Accelerator Seed List Project].
 E:\McNair\Projects\Accelerators\Code+Final_Data\parse_cohort_data.py

===Google SiteSearch===
Term: Spring 2017
Usage: Preliminary stage project intended to find an accurate web site for an unlisted company web address by using Google Search.
 E:\McNair\Projects\Accelerators\Google_SiteSearch\sitesearch.py

===F6S Crawler===
Term: Fall 2016
Usage: Used to download html files containing accelerator information from the F6S website.
 E:\McNair\Projects\Accelerators\F6S Accelerator HTMLs\F6S_Crawler\f6s_crawler_gentle.py

===F6S Parser===
Term: Fall 2016
Usage: Used to parse the html files downloaded by the F6S crawler to create a list of accelerators.
 E:\McNair\Projects\Accelerators\F6S Accelerator HTMLs\F6S_Crawler\f6s_parser.py
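f6s_parser.py itself is not reproduced here; the sketch below shows the general shape of pulling accelerator names out of downloaded HTML with only the standard library. The `accelerator-name` class is a placeholder I invented for illustration, not the real F6S markup, and the parser is simplified (it assumes names are not split across nested tags).

```python
from html.parser import HTMLParser

class AcceleratorParser(HTMLParser):
    """Collects the text of elements whose class list contains
    'accelerator-name' (a stand-in class; real markup differs)."""

    def __init__(self):
        super().__init__()
        self._in_name = False
        self.names = []

    def handle_starttag(self, tag, attrs):
        classes = dict(attrs).get("class", "")
        if "accelerator-name" in classes.split():
            self._in_name = True

    def handle_data(self, data):
        if self._in_name and data.strip():
            self.names.append(data.strip())

    def handle_endtag(self, tag):
        # Simplification: any closing tag ends the current name.
        self._in_name = False

def parse_accelerators(html_text):
    """Return the accelerator names found in one HTML document."""
    parser = AcceleratorParser()
    parser.feed(html_text)
    return parser.names
```

In the real workflow this function would be applied to every file the F6S crawler saved, accumulating one combined list of accelerators.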
E:\McNair\Projects\Accelerators\F6S Accelerator HTMLs\F6S_Crawler\f6s_parser.py

===Executive Order Crawler===
Term: Fall 2016
Usage: Used to download executive orders. NOTE: uses the scrapy framework, so it is run differently from regular python programs.
 E:\McNair\Projects\Executive_order_crawler\executive

===Kuwait Web Driver===
Term: Fall 2016
Usage: Used to download csvs of bills and questions from the Kuwait Government Website. Uses Selenium. All scripts in the folder do similar things.
 E:\McNair\Projects\Middle East Studies Web Drivers\Kuwait

===Moroccan Web Driver===
Term: Fall 2016
Usage: Used to download pdfs of bills and questions from the Moroccan Government Website. Uses Selenium. All scripts in the folder do similar things.
 E:\McNair\Projects\Middle East Studies Web Drivers\Morocco\Moroccan Bills
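Both Middle East web drivers follow the same pattern: drive the government site with Selenium, then save each linked document under a filesystem-safe name. A minimal sketch of that pattern is below; the `a.bill-pdf` locator and the Firefox driver are assumptions of mine, not the real site markup or the project's configuration, and Selenium is imported lazily so the filename helper works without a browser installed.

```python
import os
import urllib.request

def safe_filename(title):
    # Keep alphanumerics and . _ - ; replace everything else so page
    # titles become valid filenames on Windows and Unix alike.
    return "".join(c if c.isalnum() or c in "._-" else "_" for c in title)

def download_bill_pdfs(url, out_dir):
    # Hypothetical sketch: open the listing page, find every PDF link,
    # and save each file. Locators are placeholders.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    os.makedirs(out_dir, exist_ok=True)
    driver = webdriver.Firefox()
    try:
        driver.get(url)
        for link in driver.find_elements(By.CSS_SELECTOR, "a.bill-pdf"):
            href = link.get_attribute("href")
            target = os.path.join(
                out_dir, safe_filename(link.text or "bill") + ".pdf")
            urllib.request.urlretrieve(href, target)  # save the linked PDF
    finally:
        driver.quit()
```

Selenium is used because these sites render their listings with JavaScript; a plain HTTP fetch of the listing page would not see the links.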