The Impact of Entrepreneurship Hubs on Urban Venture Capital Investment

From edegan.com
Revision as of 11:27, 19 July 2016 by GunnyLiu (talk | contribs)


McNair Project
Copyright © 2016 edegan.com. All Rights Reserved.


Abstract

The Hubs Research Project is a full-length academic paper analyzing the effectiveness of "hubs", a component of the entrepreneurship ecosystem, in advancing and growing entrepreneurial success in a metropolitan area.

This research will focus primarily on large and mid-sized Metropolitan Statistical Areas (MSAs), as that is where the great majority of venture capital funding is located.

A general overview of entrepreneurial ecosystems can be found here: Entrepreneurial Ecosystem.


Current Work

General Overview

Currently there are three major tasks being performed (list to be updated):

  1. Creation of VC data table: UPDATE: Complete (see completed work)
  2. Creation of Hubs Dataset: UPDATE: See current work in progress for updates. As of Spring 2016, a list of potential Hubs with a set of characteristics was created. Many of these are not what will be defined as Hubs. We will be creating a scorecard to help subjectively define Hubs based on certain characteristics. To do so:
    1. We will determine the variables we would like to use for the scorecard
    2. We will create a process via Mechanical Turk to streamline the updating of the list
  3. Hazard Rate Model: UPDATE (7/11): Spoke to Xun Tang, an econometrics professor in Rice's Economics Department, and are now looking for appropriate proportional hazard rate models with time-varying covariates. To perform our diff-in-diff model, we need to match MSAs. To do so, we will use a hazard rate model to produce the probability that an MSA gets a Hub, and then compare MSAs that do and don't have Hubs but have similar probabilities.
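The matching step described above can be sketched as follows. The hub probabilities here are placeholders (in the project they would come from the hazard rate model), and the greedy nearest-neighbor rule is an illustrative assumption, not the project's final matching method.

```python
def match_msas(hub_msas, non_hub_msas):
    """Greedily match each Hub MSA to the non-Hub MSA with the closest
    estimated probability of getting a Hub.

    Inputs are dicts of {msa_name: probability}; each non-Hub MSA is
    used at most once."""
    matches = {}
    available = dict(non_hub_msas)
    # Match the highest-probability Hub MSAs first.
    for msa, p in sorted(hub_msas.items(), key=lambda kv: kv[1], reverse=True):
        if not available:
            break
        best = min(available, key=lambda m: abs(available[m] - p))
        matches[msa] = best
        del available[best]
    return matches
```

For example, a Hub MSA with probability 0.8 would be paired with the unmatched non-Hub MSA whose probability is nearest to 0.8.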

Work In Progress

Hubs Data

(Week of 7/11)

1) We published the Twitter count task on Mechanical Turk and received results.

2) We audited the results and updated the task on the Amazon site.

3) We are creating additional potential Turk tasks on the Amazon site (see Hubs: Mechanical Turk).

4) We are finding more potential hubs from the members of the International Business Innovation Association.


(Week of 7/4)

1) We have created the list and commented our thoughts after the '---' markers. For determining the variables, we have separated the list into two parts: a list of desired variables and a list of variables that were previously collected, many of which are also desired variables.

2) We have also created an example of how to write Mechanical Turk tasks for collecting certain variables.


Sample List of Hubs

  1. Capital Factory, Austin
  2. 1871, Chicago
  3. Rocket Space, San Francisco
  4. 1776, Washington D.C.
  5. Betamore, Baltimore
  6. Packard Place, Charlotte
  7. The Venture Center, Little Rock
  8. GSV Labs, San Francisco
  9. The Hive, Palo Alto
  10. Innovation Pavilion, Denver
  11. OSC Tech Lab, Akron
  12. Speakeasy, Indianapolis
  13. Riverside.io, Riverside
  14. The Salt Mines, Columbus
  15. InNEVation, Las Vegas
  16. 804 RVA
  17. Impact Hub, Salt Lake
  18. Awesome Inc, Louisville
  19. Geekdom, San Antonio
  20. Alloy26, Pittsburgh
  21. ReSET, Hartford
  22. Ansir Innovation Center, San Diego
  23. Domistation, Tallahassee
  24. Atlanta Tech Village, Atlanta
  25. Spark Labs, New York

Variables for Hubs

We will be creating a "Hubs scorecard" to determine how hub-like potential spaces are. In order to do so, we will evaluate the places based on certain variables. Previous variables for potential hubs were collected. Below, we list those as well as other variables we think might be helpful to build out the scorecard.

Ideally, we would have the following variables (not collected previously):

  • Onsite VC/Angel/Investors (Count or binary)
  • Onsite Mentors (binary) --- Are these the same as advisers?
  • "Office hours" with investors or mentors (binary) --- note: previously collected included number of events, but did not separate them into categories (e.g. networking events, workshops, etc.). We view this separation as important
  • Onsite temporary workshops (binary or count) *** see mechanical turk
  • Networking Meetups (Binary or count) *** see mechanical turk
  • Sponsors and Partners (binary and list) --- are these the same?
  • Alumni Network (binary) --- not all potential hubs list this, and the fact that some do might indicate its importance
  • Num of Companies --- to help determine size, as getting physical square footage is difficult
  • Nonprofit (binary) --- helpful in determining the goals of potential hubs
  • Mission Includes Key Buzzwords (e.g. "ecosystem", "community") --- helps separate simple coworking spaces from hubs
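One way the scorecard could combine these variables is a simple weighted sum. The variable names and weights below are illustrative assumptions only; the project's actual scorecard had not yet been fixed.

```python
# Illustrative weights; the project's actual scorecard weights were
# still to be determined.
WEIGHTS = {
    "onsite_investors": 2.0,  # Onsite VC/Angel/Investors (binary)
    "onsite_mentors": 1.5,    # Onsite Mentors (binary)
    "office_hours": 1.5,      # "Office hours" with investors/mentors
    "alumni_network": 1.0,    # Alumni Network (binary)
    "nonprofit": 1.0,         # Nonprofit status (binary)
    "buzzwords": 0.5,         # Mission includes key buzzwords (binary)
}

def hub_score(record):
    """record: dict mapping variable name -> 0/1 indicator.
    Missing variables count as 0."""
    return sum(weight * record.get(var, 0) for var, weight in WEIGHTS.items())
```

A space scoring above some cutoff would then be classified as a Hub; choosing that cutoff is part of the open scorecard design.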


Group by difficulty level for Turks:

  • Easy
    • Twitter Activity
    • Non-profit
    • Mission Includes Key Buzzwords
  • Moderate
    • Number of Companies
    • Sponsors and Partners
    • Number of Events (including onsite temporary workshops and networking meetups)
    • Onsite Mentors
  • Hard
    • “Office hours” with investors or mentors
    • Onsite VC/Angel/Investors
    • Temporary Workshops
    • Networking Meetups


Example of Prior Variables Collected:

  • Specific Industry --- defined as LinkedIn self-identifier; no categories, just plain text. We think what we really want is to see if they have a specialty (e.g. healthcare)
  • Num of Events --- relatively complete inputs, but from March 2016 (see above as well)
  • Price for Single Space --- defined as price for flexible desk, relatively complete inputs
  • Price for Office --- no inputs
  • Twitter Activity (Multinomial or Count) --- High=2/Moderate=1/No=0, with no explanation of how to categorize the activity. Also, no handles were recorded
  • Size (sqft) --- no records for majority of the companies
  • Num Conference Rooms --- no records for majority of the companies
  • Onsite accelerator (binary) --- relatively complete inputs
  • Onsite code school (binary) --- relatively complete inputs
  • Community Membership (binary) --- relatively complete inputs

Mechanical Turk

We have created a page documenting our actual work with Mechanical Turk for this paper here: Hubs: Mechanical Turk.

For more general information on Mechanical Turk, go here: Mechanical Turk (Tool).

  • Twitter activity:

UPDATE (7/14): Updated turk to reflect our desired formats

UPDATE (7/12): AUDIT RESULTS: We noticed

UPDATE (7/11): Uploaded and published on Amazon's Mechanical Turk site. Given the time cost of either recording the number of tweets in a month or looking up more than 10 tweets, we decided to record the date of the 10th most recent tweet. Using a sample of ~10 companies, we noticed minimal differences in the data when using 10, 20, or 30 tweets.

  1. Copy the text in the Search Text into a search engine.
  2. Click on the result from twitter.com with the company name. If the link does not appear in the first 3 pages, record DNE for both outputs.
  3. Record the company's Twitter handle into Twitter Handle.
  4. Record the date (MM/DD/YY) of the 10th most recent tweet for Twitter Activity. If there are fewer than 10 tweets, record DNE.
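Once the date of the 10th most recent tweet is recorded, it can be converted into an activity rate. This conversion is our sketch of how the recorded dates might be used, not part of the Turk task itself.

```python
from datetime import date

def tweets_per_day(tenth_tweet_date, audit_date):
    """Rough Twitter activity: 10 tweets divided by the days elapsed
    since the 10th most recent tweet. Returns None for a DNE record."""
    if tenth_tweet_date is None:  # Turk recorded DNE
        return None
    days = (audit_date - tenth_tweet_date).days
    return 10 / max(days, 1)
```

A company whose 10th most recent tweet is 10 days old would score 1.0 tweets per day; one whose 10th tweet is 100 days old would score 0.1.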


  • Nonprofit: UPDATE: written, not published, on amazon's mechanical turk site

Considerations

  • Difficulties Encountered:
  • Expected Time to Complete:
  • Expectation of Results (accuracy of turk, comprehensiveness):
  • Other Comments:

Procedure

  1. Copy the text in the Search Text into a search engine.
  2. Click on the result that is the website of the company. If the company's website does not appear in the first three pages of results, mark as DNE.
  3. Go to the links that describe the company; they are usually labelled 'About', 'Our Story', or 'Mission'.
  4. Look for the key word 'nonprofit'/'non-profit'.
  5. If 'nonprofit' is identified, mark as 1, otherwise 0.
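The keyword check in steps 4-5 amounts to a pattern match on the page text. A minimal sketch follows; the regular expression is our assumption about reasonable spellings of the key word.

```python
import re

# Matches 'nonprofit' or 'non-profit', case-insensitive.
NONPROFIT = re.compile(r"non-?profit", re.IGNORECASE)

def mark_keyword(page_text, pattern=NONPROFIT):
    """Return 1 if the pattern appears anywhere in the page text,
    otherwise 0, mirroring the Turk marking rule."""
    return 1 if pattern.search(page_text) else 0
```

The same function could be reused for the buzzword task below by passing a different pattern, which fits the comment that the two tasks could be combined into one Turk assignment.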


  • Key Buzzword

Considerations

  • Difficulties Encountered:
  • Expected Time to Complete:
  • Expectation of Results (accuracy of turk, comprehensiveness):
  • Other Comments: can be combined with nonprofit to the same turk assignment

Procedure

  1. Copy the text in the Search Text into a search engine.
  2. Click on the result that is the website of the company. If the company's website does not appear in the first three pages of results, mark as DNE.
  3. Go to the links that describe the company; they are usually labelled 'About', 'Our Story', or 'Mission'.
  4. Look for the key words 'entrepreneurial ecosystem', 'startup/technology hub', and 'community'.
  5. If any of the key words is identified, mark as 1, otherwise 0.


  • NUMBER OF EVENTS:

UPDATE (7/13): written, not published, on amazon's mechanical turk site

Considerations

  • Difficulties Encountered: Hard to separate different types of events (workshops, info sessions, meetups, etc.). Most companies put all their events in the same section and do not put event types in the titles. We would have to look into the details of each event to find its type, and even then some event descriptions do not let us determine the type easily. So we will count them all as 'events'.
  • Expected Time to Complete:
  • Expectation of Results (accuracy of turk, comprehensiveness):
  • Other Comments:

Procedure

  1. Copy the text in the Search Text into a search engine.
  2. Click on the result that is the website of the company. If the company's website does not appear in the first three pages of results, mark as DNE.
  3. Look for links related to events, such as 'Events' or 'Calendar', on the homepage.
  4. If not found on the homepage, check 'About' and 'Community'.
  5. Count the number of events in July 2016 and record it. If there is no event information on the website, record DNE.

Note***: Events include meetups, workshops, info sessions, etc. We do not count them separately because it is difficult to do so: most companies put all events in the same section without event types in the titles, so determining a type requires reading each event's details, and even then some descriptions are ambiguous. Differentiating event types demands more time and effort than is suitable for a Mechanical Turk task.
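Counting the pooled events for a given month, as in step 5, can be sketched as follows (event dates here stand in for whatever the Turk reads off the company's calendar):

```python
from datetime import date

def count_events(event_dates, year, month):
    """Count events falling in the given year and month. All event
    types (meetups, workshops, info sessions) are pooled, matching
    the Turk instructions above."""
    return sum(1 for d in event_dates if d.year == year and d.month == month)
```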


  • Onsite Mentors: UPDATE: written, not published, on amazon's mechanical turk site

Considerations

  • Difficulties Encountered: Companies put information about mentors or mentoring programs in very different places. Some have a specific link or section for mentors/mentoring programs, some put it as a sub-section under 'About' or 'Our Team', and others put it under membership 'benefits' or 'perks'.
  • Expected Time to Complete: 10 - 40 seconds
  • Expectation of Results (accuracy of turk, comprehensiveness):
  • Other Comments: Some companies give more detailed information about mentors or mentoring programs, but some only mention them in one line. Do we need to treat them differently?

Procedure

  1. Copy the text in the Search Text into a search engine.
  2. Click on the result that is the website of the company. If the company's website does not appear in the first three pages of results, mark as DNE.
  3. Look for links related to mentorship, such as 'Mentors', 'Mentorship', or 'Mentoring Programs'.
  4. If the key words can be identified, mark as 1.
  5. If there is no explicit mentoring section, look for links describing the company, such as 'About', 'Our Team', or 'Our Mission', and look for a subsection or mention of mentor/mentorship/mentoring.
  6. If these exist, mark as 1.
  7. If not, go to links related to membership 'benefits', 'perks', or similar.
  8. Repeat the same keyword search as in steps 4 and 5.
  9. If there is no mention of mentorship in these sections, type the company, city, and 'mentoring' into a search engine. If a link to a reliable website (such as Desktime) appears and mentorship can be found in the description, mark as 1.
  10. If none of these steps result in a mark of 1, mark as 0.



  • Number of Members: UPDATE: written, not published, on amazon's mechanical turk site

Considerations

  • Difficulties Encountered: Some companies don't list all their members, only selected ones. Some do not separate current members from alumni, writing, for example, "we have served more than 120 startups..."
  • Expected Time to Complete:
  • Expectation of Results (accuracy of turk, comprehensiveness):
  • Other Comments:

Procedure

  1. Copy the text in the Search Text into a search engine.
  2. Click on the result that is the website of the company. If the company's website does not appear in the first three pages of results, mark as DNE.
  3. Look for a 'Members' or 'Residents' link; these are usually under 'Community', 'Membership', 'Our Space', or 'The Space'.
  4. Count the number of members.
  5. If a 'Members' link or section is not found, go to 'Community' and 'Coworking' and look for a description of the number of startups/founders/members in the community. Record the number.
  6. If the number of members cannot be identified using the above steps, record DNE.


  • Sponsors and Partners:UPDATE: written, not published, on amazon's mechanical turk site

Considerations

  • Difficulties Encountered: Some companies put other companies' logos on their webpage (usually at the bottom of the homepage), such as Google and Amazon, without saying that they are sponsors or partners.
  • Expected Time to Complete:
  • Expectation of Results (accuracy of turk, comprehensiveness):
  • Other Comments:

Procedure

  1. Copy the text in the Search Text into a search engine.
  2. Click on the result that is the website of the company. If the company's website does not appear in the first three pages of results, mark as DNE.
  3. Look for a link to, or mention of, 'Sponsors', 'Partners', or 'Supporters'; these are often under 'About', 'Community', or related sections.
  4. If sponsors or partners can be found, mark as 1 and list them; otherwise mark as 0.

Completed Work

Venture Capital Data General Overview

The main goal of the data set is to aggregate company, fund, and round level data to be analyzed at a combined MSA and year level. The data set is composed of two major parts: a granular company/fund/round table and an aggregated CMSA-Year table. The data includes all United States venture capital transactions (MoneyTree) from 1990 through 2015.

The Hubs data set, from SDC Platinum, has been constructed on the server:

Data files are in 128.42.44.181/bulk/Hubs
All files are in 128.42.44.182/bulk/Projects/Hubs
psql Hubs2
See the server for the code and the first five rows of each table

Procedure - Granular Table

  1. Start with separate raw datasets for Companies, Funds, and Rounds
  2. Add Data to Each Individual dataset (e.g. add MSA code)
  3. Clean and standardize names (e.g. company or fund name) for each dataset
  4. Join the Datasets (here we need to exclude undisclosed companies)

Procedure - CMSA-Year Table

  1. Create a consistent CMSA-Year table to be used later
  2. Using the tables from the granular table, parse out the right data
  3. Join the parsed out data with the CMSA-Year Table
  4. Join these Tables

VC Specific Tables and Procedure

Raw data tables

  1. Funds: fund name, first investment date, last investment date, fund closing date, address, known investment, average investment, number of companies invested, MSA, MSA code.
  2. Rounds: round date, company name, state, round number, stage 1, stage 2, stage 3
  3. Combined Rounds: company name, round date, disclosed amount, investor
  4. Companies: company name, first investment, last investment, MSA, MSA code, address, state, date founded, known funding, industry
  5. MSA List: MSA, MSA code, CMSA, CMSA code
  6. Industry List: collapses 6 industry categories into 4: ICT, Life Sciences, Semiconductors, Other


Granular Table (Fund-Round-Company)

The final table here contains all venture capital transactions by disclosed funds and portfolio companies, together with their CMSAs. To get the table, we processed the raw data sets in the following steps:

  1. Clean Company data
    1. Import raw data companies
    2. Add variable 'CMSA' from data set MSA list, update variable 'industry' by joining data set industry list
    3. Remove duplicates and remove undisclosed companies
  2. Clean Fund data
    1. Import raw data funds
    2. Add variable 'CMSA'
    3. Remove duplicates and remove undisclosed funds
    4. Match fund names against themselves using The Matcher (see The Matcher (Tool)) to get the standard fund names
  3. Clean Round data
    1. Import raw data rounds and combined rounds
    2. Add variables 'number of investment', 'estimated investment' and 'year'
    3. Remove duplicates and remove undisclosed funds
  4. Combine Companies and Rounds
    1. Combine cleaned companies and rounds data table on company names
    2. Add variable 'round number' and 'stage'
    3. Remove duplicates
  5. Combine Funds and rounds-companies
    1. Match fund names in the rounds data table with the standard fund names using The Matcher (see The Matcher (Tool)) to standardize fund names in the rounds data table
    2. Join standard fund names to rounds-companies table
    3. Join cleaned funds table to rounds-companies table on standard fund names


CMSA-Year Aggregated Table

The final table contains the number of companies and the amount of investment in each CMSA, categorized by distance and stage.

We processed data as follows:

  1. Create the CMSA-Year Table
    1. Create single-variable tables: distinct CMSA, year, stage, founding year of fund, and founding year of company.
    2. Create the cross-product tables: CMSA-year, CMSA-year-fund-year-founded, and CMSA-year-company-year-founded
  2. Draw data from cleaned companies, funds and rounds tables
    1. Create a table with 'CMSA', 'number of companies' and 'year Founded' from cleaned companies table and join it to CMSA -year founded
    2. Create a table with 'Company CMSA', 'round year', 'disclosed amount' from rounds-companies combined table, and add stage binary variables. Join it to CMSA-year-company year founded
    3. Create a table with 'CMSA', 'fund year', 'number of investors' from cleaned funds table and join it to CMSA-year-fund year founded
  3. Create near-far and stages table
    1. Add fund data to rounds-companies
    2. Create near-far and stages binary variable
    3. Count investment and deals by CMSA and year, categorized by near-far and stages
  4. Combine all tables by CMSA and round-year
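The counts and sums in step 3 above reduce to grouping rounds by (CMSA, year). A sketch, with illustrative field names:

```python
from collections import defaultdict

def aggregate_by_cmsa_year(rounds):
    """Sum disclosed amounts and count deals per (CMSA, year) cell.
    Each round is a dict with 'cmsa', 'year', and 'amount' keys
    (field names are illustrative)."""
    cells = defaultdict(lambda: {"deals": 0, "amount": 0.0})
    for r in rounds:
        cell = cells[(r["cmsa"], r["year"])]
        cell["deals"] += 1
        cell["amount"] += r["amount"]
    return dict(cells)
```

The near-far and stage breakdowns would be handled the same way, with the binary variables added to the grouping key.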

Supplementary Data Sets

Supplementary data sets are cleaned and joined back to the CMSA-Year table on CMSA and year:

  1. Number of STEM graduate students, by university and year (2005 to 2014).
  2. University R&D spending, by university and year (2004 to 2014).
  3. Income per capita, by MSA and year (2000 to 2012).
  4. Wages and salaries, by MSA and year (2000 to 2012).


The datasets can respectively be found at:

E:\McNair\Projects\Hubs\STEM grads for upload v2.xls
E:\McNair\Projects\Hubs\NSF spending for upload.xls
E:\McNair\Projects\Hubs\Income per capita upload.xls
E:\McNair\Projects\Hubs\Wage for upload v2.xls

Resources

Additional Resources