Projects to be checked for code, etc.:
*[[LinkedIn Crawler (Python)]] (crunchbase_founders.py is actually located at E:\McNair\Projects\Accelerators\Spring 2017\crunchbase_founders.py)
*[[Mapping on R]] (need to download)
*[[SDC Normalizer]] (this should be fine; can't find the hubs file that he ran it on)
*[[Start Up Address Finder Algorithm (Tool)]] (the algorithm section is not complete, and the script's location is unknown)
*[[The Matcher (Tool)]] (works)
*[[Twitter Follower Finder (Tool)]] (not sure how to get the database/run this)
*[[Twitter News Finder (Tool)]] (location is not specified and there are no instructions for running it; it could not be found in McNair/Projects/TwitterCrawler)
*[[Twitter Webcrawler (Tool)]] (not sure how to get the database/run this)
*[[URL Finder (Tool)]]
*# IOError: File E:\McNair\Projects\Accelerators\cohorts.csv does not exist, because the Accelerators folder has been restructured; however, cohorts.csv cannot be found in the restructured folders either.
*# IOError: File E:\McNair\Software\Scripts\URLFindersCopyabouttest.cvs does not exist.
*# Works after changing out_path in glink, and path1read, path1write, and out_path in URL Compiler.py (see the path sketch after this list).
*# The input and output file locations have changed; they are now in the \McNair\Projects\Hubs\summer 2016\Searching folder.
*[[Whois Parser]] (works when testing on [[Industry Classifier]])
*[[Moroccan Parliament Web Crawler]] (need to install Scrapy? See the import check after this list.)
*[[Eventbrite Webcrawler (Tool)]]
*[[Govtrack Webcrawler (Tool)]] (pretty sure it works, but the site it runs against, http://www.edegan.com/wiki/index.php, doesn't seem to resolve: "edegan.com’s server DNS address could not be found")
*[[Industry Classifier]] (works, but the project page is inaccurate; the file is actually located at McNair/Projects/Accelerators/Spring 2017/Industry_Classifier)
*[[Interactive Maps - The Whole Process]] (probably works, but when the Whois Parser is run on it, the output file comes back empty)
*[[Google Scholar Crawler]] (works; however, the script is actually named crawler.py and sits in the Google_Scholar_Crawler folder in E:\McNair\Software; scholar.py only exists in his GitHub)
*[[Collecting SBIR Data]] (the script concat_excel.py is actually at E:\McNair\Projects\SBIR\Data\Aggregate SBIR\concat_excel.py, not E:\McNair\Projects\SBIR\concat_excel.py)
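
The first two URL Finder notes are hard-coded-path failures: the scripts raise a bare IOError as soon as an input file has moved. Below is a minimal sketch of the kind of early check that makes these failures self-explanatory; the cohorts.csv path is copied from the error above, but check_input, OUT_DIR, and the output file name are illustrative assumptions, not the URL Finder's actual code:

<pre>
import os
import sys

# Hard-coded paths from the errors above; these must be updated whenever
# the project folders are restructured.
COHORTS_CSV = r"E:\McNair\Projects\Accelerators\cohorts.csv"
OUT_DIR = r"E:\McNair\Projects\Hubs\summer 2016\Searching"  # current location per note 4

def check_input(path):
    # Fail early with a readable message instead of a bare IOError.
    if not os.path.exists(path):
        sys.exit("Input not found: %s (the folder may have been restructured)" % path)
    return path

with open(check_input(COHORTS_CSV)) as infile:
    cohorts = infile.read().splitlines()

out_path = os.path.join(OUT_DIR, "urls_out.csv")  # hypothetical output name
print("Read %d cohort rows; output will go to %s" % (len(cohorts), out_path))
</pre>

Keeping all such paths in one block at the top of each script, as above, would make the out_path edits described in note 3 a one-place change.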
  
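On the Scrapy question for the [[Moroccan Parliament Web Crawler]]: Scrapy is installed with <code>pip install scrapy</code>, and a quick import check shows whether it is already on the machine. This is a diagnostic sketch, separate from the crawler itself:

<pre>
# Check that Scrapy is importable before trying to run the crawler;
# if the import fails, install it with: pip install scrapy
try:
    import scrapy
    print("Scrapy %s is available." % scrapy.__version__)
except ImportError:
    print("Scrapy is not installed; run 'pip install scrapy' and retry.")
</pre>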
 
The pages below need to be made into project pages, or have content extracted and then turned into a project page
