
[[Category: McNair Admin]]
 
Alex's notes from creating a test web server that will eventually host important stuff (aka a test run on a cheap Dell Inspiron).
$ sudo mkdir /var/lib/mediawiki
$ sudo mv mediawiki-1.26.2/* /var/lib/mediawiki
 
Then set up the mediawiki directory:
 
$ cd /var/www/html
$ sudo ln -s /var/lib/mediawiki mediawiki
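A quick sanity check at this point (assuming Apache 2 is already serving /var/www/html on this box): confirm the symlink resolves, then reload Apache so it picks up the new directory.
 $ ls -l /var/www/html/mediawiki
 $ sudo service apache2 restart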
== Mediawiki Security (1/15/16) ==
Then add this line to LocalSettings.php:
require_once ("$IP/extensions/LabeledSectionTransclusion/LabeledSectionTransclusion.php");
[https://en.wikipedia.org/wiki/Wikipedia:Transclusion#Without_using_the_labeled_section_method This Wikipedia page] documents a method for selective transclusion that doesn't require the extension, but I was unable to replicate the results on the test webserver, so I assume that it requires some package or code that is specific to Wikipedia.
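For future reference, the labeled-section syntax from the extension itself looks roughly like this (the page and section names below are just placeholders): mark the section on the source page, then transclude it on the target page.
 <nowiki><section begin=summary />...text to share...<section end=summary /></nowiki>
 <nowiki>{{#lst:Some source page|summary}}</nowiki>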
Then, per the installation instructions on the [https://www.mediawiki.org/wiki/Extension:MobileFrontend#Installation extension page], add these two lines to LocalSettings.php:
require_once ("$IP/extensions/MobileFrontend/MobileFrontend.php");
$wgMFAutodetectMobileView = true;
The import threw a 500 Server Error. When I checked Special:AllPages for the Template namespace, the entries for Template:Infobox and the other imported template pages do show up, but when I try to view them, I get another 500 Server Error.
== Debugging Special:Import Errors (2/1/16 - 2/5/16) ==
I had already created a phpinfo page for debugging, as described in [http://stackoverflow.com/a/10891317 this Stack Overflow answer]. (Note that <code>$ php -i | grep 'what_you_seek'</code> also works, but the command-line interface (CLI) may use a different configuration file than the web server.)
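The phpinfo page itself is just a one-line PHP file dropped somewhere Apache serves; the filename and path below are placeholders, not necessarily what I used.
 $ echo '<?php phpinfo(); ?>' | sudo tee /var/www/html/phpinfo.php
The "Loaded Configuration File" row on that page shows which php.ini the web server actually reads, which is the one that matters when the CLI config differs.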
== A Miraculous Fix (2/8/16) ==
So I loaded the Template:Infobox/doc page on the test wiki this afternoon, and it miraculously loaded! It still has a Lua script error, but at least it's not just 500 server errors all the way. The Lua script error seems to be a timeout error, and from some Google searching, it seems that the default timeout length is 10 seconds for Lua, whereas markup-based templates have a 60 second timeout limit. Importing the XML file with the Infobox templates now works as well. Really not sure how this worked...
== Configuring Image Uploads (2/8/16) ==
Might as well try to make some infoboxes. But first I need to configure file uploads. There's [https://www.mediawiki.org/wiki/Manual:Configuring_file_uploads a Mediawiki page] that I followed pretty closely. First, I went to the php.ini configuration file and checked that file_uploads was set to On, and noted that open_basedir isn't set. Then I set the permissions for the images directory to 755 with:
 $ sudo chmod -R 755 /var/lib/mediawiki/images
And I also added these lines to the apache2.conf configuration file:
 <Directory /var/www/wiki/images>
 Options -Indexes
 </Directory>
Then I set the $wgEnableUploads option to true in LocalSettings.php. When I try to upload a file, however, I get an exception. I added this line to LocalSettings.php to print a backtrace:
 $wgShowExceptionDetails = true;
And the resulting backtrace/error message:
 [b9bfa0ee] /wiki/Special:Upload MWException from line 1873 of /var/lib/mediawiki/includes/filerepo/file/LocalFile.php: Could not acquire lock for 'File-Donald_August_19_(cropped).jpg.'
 Backtrace:
 #0 /var/lib/mediawiki/includes/filerepo/file/LocalFile.php(1152): LocalFile->lock()
 #1 /var/lib/mediawiki/includes/upload/UploadBase.php(708): LocalFile->upload(string, string, string, integer, array, boolean, User)
 #2 /var/lib/mediawiki/includes/specials/SpecialUpload.php(488): UploadBase->performUpload(string, string, boolean, User)
 #3 /var/lib/mediawiki/includes/specials/SpecialUpload.php(197): SpecialUpload->processUpload()
 #4 /var/lib/mediawiki/includes/specialpage/SpecialPage.php(384): SpecialUpload->execute(NULL)
 #5 /var/lib/mediawiki/includes/specialpage/SpecialPageFactory.php(553): SpecialPage->run(NULL)
 #6 /var/lib/mediawiki/includes/MediaWiki.php(281): SpecialPageFactory::executePath(Title, RequestContext)
 #7 /var/lib/mediawiki/includes/MediaWiki.php(714): MediaWiki->performRequest()
 #8 /var/lib/mediawiki/includes/MediaWiki.php(508): MediaWiki->main()
 #9 /var/lib/mediawiki/index.php(41): MediaWiki->run()
 #10 {main}
Google searches yield [https://www.mediawiki.org/wiki/Thread:Project:Support_desk/File_Upload_Error this thread] and [https://www.mediawiki.org/wiki/Thread:Project:Support_desk/Problem_With_File_Upload:_Could_not_acquire_lock_for_%22mwstore://local-backend/local-public/1/1e%22. this thread] dealing with this error message.
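One more thing worth checking here (a hunch, not a confirmed fix): this kind of lock failure is often down to the web server user not being able to write under the images directory. Assuming Apache runs as www-data on this Ubuntu box, the ownership can be inspected and, if necessary, handed over:
 $ ls -ld /var/lib/mediawiki/images
 $ sudo chown -R www-data:www-data /var/lib/mediawiki/images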
== Installing Ghost (3/14/2016) ==
You need to make sure you have the correct version of node installed (it should be a version that Ghost supports; at the time of writing, that's 0.10.x). Credits to [http://www.hostingadvice.com/how-to/install-nodejs-ubuntu-14-04/ this page] for helping me out.
 $ sudo apt-get install nodejs
 $ nodejs -v
Go ahead and try to remove the old version of node and then clean up any unused packages:
 $ sudo apt-get remove --purge node
 $ sudo apt-get autoremove
Make a symbolic link from nodejs to node:
 $ sudo ln -s /usr/bin/nodejs /usr/bin/node
Install npm too and check the version:
 $ sudo apt-get install npm
 $ npm -v
Now you can follow the [http://support.ghost.org/installing-ghost-linux/ instructions] for installing Ghost!
 $ curl -L https://ghost.org/zip/ghost-latest.zip -o ghost.zip
 $ sudo apt-get install unzip
 $ sudo mkdir /var/www/ghost
 $ unzip -uo ghost.zip -d /var/www/ghost
 $ cd /var/www/ghost
 $ sudo npm install
 $ sudo npm start
Note that we had to install the unzip package. I also chose not to install in the [http://support.ghost.org/config/#about-environments production environment] so that I would have more debugging info and the ability to tinker with the theming.
== Ghost on Apache? (3/14/2016) ==
So Ed would rather not serve the blog off of port 2368. It looks like there's documentation for setting Ghost up on [http://support.ghost.org/basic-nginx-config/ nginx] and [https://www.howtoinstallghost.com/how-to-host-ghost-on-an-apache-subdomain/ apache]. Since we would rather not move the mediawiki off of apache, let's just try doing Ghost and apache. Turns out that doing so is a lot more hassle than I initially thought. Ghost and nginx may be easier, and maybe Ghost+apache isn't even that bad, but it's definitely more involved, especially when setting it up alongside another site (the mediawiki). [https://www.howtoinstallghost.com/how-to-host-ghost-on-an-apache-subdomain/ This page] looked helpful, and [https://www.howtoforge.com/tutorial/how-to-install-ghost-blog-on-ubuntu-15.10/#step-install-ghost-blog this post] has a complete tutorial if you want to go through with it, but it seems difficult to get Ghost set up correctly on Apache.
== Installing WordPress (3/14/2016) ==
Following the [http://codex.wordpress.org/Installing_WordPress#Detailed_Instructions Detailed Instructions] to install WordPress was enough to get me started:
 $ cd ~/Downloads
 $ wget https://wordpress.org/latest.tar.gz
 $ tar -xzvf latest.tar.gz
Configure a database for WordPress (it can be called something other than wordpress) and make a new MySQL user (can be called something other than mcnair_wp) that has all permissions for the wordpress database. Obviously, you should replace a_secure_password with an actual password for the user (but leave the quotes around the password when typing the MySQL command). FLUSH PRIVILEGES reloads the permissions tables.
 $ mysql -u root -p
 Enter password:
 mysql> CREATE DATABASE wordpress;
 mysql> GRANT ALL PRIVILEGES ON wordpress.* TO "mcnair_wp"@"localhost" IDENTIFIED BY "a_secure_password";
 mysql> FLUSH PRIVILEGES;
 mysql> EXIT
You can verify that the wordpress database and user were created correctly by logging into the mysql client command-line interface as the new user:
 $ mysql -u mcnair_wp -p
 Enter password:
 mysql> SHOW DATABASES;
 mysql> USE wordpress;
 mysql> EXIT
Make a wp-config.php by making a copy of the wp-config-sample.php file and renaming it:
 $ cp ~/Downloads/wordpress/wp-config-sample.php ~/Downloads/wordpress/wp-config.php
 $ sudo vi ~/Downloads/wordpress/wp-config.php
Edit the lines that define the DB_NAME, DB_USER, and DB_PASSWORD constants to have the values that you used to set up the MySQL database and user above.
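Concretely, with the placeholder values used above (swap in the real password), the edited wp-config.php lines end up looking like:
 define('DB_NAME', 'wordpress');
 define('DB_USER', 'mcnair_wp');
 define('DB_PASSWORD', 'a_secure_password');
 define('DB_HOST', 'localhost');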
Copy the wordpress directory to /var/lib/wordpress and then make a symlink at /var/www/html/blog pointing to /var/lib/wordpress (much like how the mediawiki was done) so that http://128.42.44.22/blog points to the WP blog:
 $ sudo cp -r ~/Downloads/wordpress /var/lib/wordpress
 $ cd /var/www/html
 $ sudo ln -s /var/lib/wordpress blog
Navigate a browser to http://128.42.44.22/blog/wp-admin/install.php to complete the installation (you'll be asked to create an admin user for the WordPress site).
== Installing Open Web Analytics (3/21/2016) ==
 $ cd ~/Downloads
 $ git clone https://github.com/padams/Open-Web-Analytics.git
 $ cd /var/lib/mediawiki/extensions
 $ cp -r ~/Downloads/Open-Web-Analytics ./owa
Edit LocalSettings.php and add the following line:
 require_once('extensions/owa/mw_plugin.php');
Go to the list of Special Pages on the mediawiki and click on the Open Web Analytics special page to install OWA. Long story short, this extension has only been [https://github.com/padams/Open-Web-Analytics/wiki/MediaWiki-Integration tested] up to Mediawiki version 1.16. I tried so hard, and got so far, but in the end, it doesn't even matter.
== Installing Piwik (3/21/2016) ==
Installing Piwik itself ([https://piwik.org/docs/installation/#start-the-installation instructions from Piwik]):
 $ cd ~/Downloads
 $ wget http://builds.piwik.org/piwik.zip && unzip piwik.zip
 $ sudo cp -r ~/Downloads/piwik /var/lib/piwik
 $ cd /var/lib/piwik
 $ sudo chmod 777 tmp
 $ cd /var/www/html
 $ sudo ln -s /var/lib/piwik analytics
Navigate a browser to http://128.42.44.22/analytics and go through the Piwik installation. Make sure you fix everything on the "System Checks" page. When you get to the Database Setup page, you'll need to configure a MySQL database for Piwik. If you followed the steps for configuring a WordPress database, the steps are almost identical. Configure a database for Piwik (it can be called something other than piwik) and make a new MySQL user (can be called something other than mcnair_piwik) that has all permissions for the piwik database. Obviously, you should replace a_secure_password with an actual password for the user (but leave the quotes around the password when typing the MySQL command). FLUSH PRIVILEGES reloads the permissions tables.
 $ mysql -u root -p
 Enter password:
 mysql> CREATE DATABASE piwik;
 mysql> GRANT ALL PRIVILEGES ON piwik.* TO "mcnair_piwik"@"localhost" IDENTIFIED BY "a_secure_password";
 mysql> FLUSH PRIVILEGES;
 mysql> EXIT
You can verify that the piwik database and user were created correctly by logging into the mysql client command-line interface as the new user:
 $ mysql -u mcnair_piwik -p
 Enter password:
 mysql> SHOW DATABASES;
 mysql> USE piwik;
 mysql> EXIT
Installing the Piwik Integration extension for Mediawiki:
 $ cd ~/Downloads
 $ wget https://github.com/DaSchTour/piwik-mediawiki-extension/archive/master.zip
 $ unzip -uo master.zip -d /var/lib/mediawiki/extensions
 $ cd /var/lib/mediawiki/extensions
 $ mv piwik-mediawiki-extension-master/ Piwik/
Edit LocalSettings.php to add these lines:
 require_once("$IP/extensions/Piwik/Piwik.php");
 $wgPiwikURL = "128.42.44.22/analytics/";
 $wgPiwikIDSite = "1";
But it doesn't seem to register the visits... Turns out Piwik by default honors DoNotTrack (as I learned [http://piwik.org/faq/troubleshooting/faq_58/ here]), so my browser wouldn't register as a visit. So there are visits now. Yay!
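For future reference, a quick way to double-check that the tracker itself is reachable (assuming Piwik's default tracker endpoint, piwik.php, behind the /analytics symlink set up above) is to request it directly and make sure Apache doesn't return a 404:
 $ curl -I http://128.42.44.22/analytics/piwik.php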
Also, to make the little graphs next to the numbers not be broken, you have to get the most recent version of the GD module for PHP:
 $ sudo apt-get install php5-gd
 $ sudo service apache2 restart
== Installing Google Analytics (3/23/2016) ==
 $ cd ~/Downloads
 $ wget https://extdist.wmflabs.org/dist/extensions/googleAnalytics-REL1_26-d832801.tar.gz
 $ tar -xzvf googleAnalytics-REL1_26-d832801.tar.gz
 $ cd /var/lib/mediawiki/extensions
 $ cp -r ~/Downloads/googleAnalytics ./GoogleAnalytics
Add these lines to LocalSettings.php:
 require_once("$IP/extensions/GoogleAnalytics/googleAnalytics.php");
 // Replace xxxxxxx-x with YOUR GoogleAnalytics UA number
 $wgGoogleAnalyticsAccount = 'UA-xxxxxxx-x';
== Installing Semantic Mediawiki (3/25/2016) ==
The [https://www.semantic-mediawiki.org/wiki/Help:Installation/Using_Composer_with_MediaWiki_1.25%2B installation process] looks complicated, but let's be careful. First up, we're going to [https://getcomposer.org/doc/00-intro.md#globally install Composer globally].
 $ cd ~/Downloads
and then follow the download instructions [https://getcomposer.org/download/ here] to install Composer to the ~/Downloads directory. Then move the composer.phar executable to a directory in our path:
 $ sudo mv ~/Downloads/composer.phar /usr/local/bin/composer
 $ composer --version
 $ composer --list
Now you can just call "composer" instead of doing "php /path/to/composer/composer.phar". Proceeding with the install for SMW...
 $ cd /var/lib/mediawiki
 $ sudo composer require mediawiki/semantic-media-wiki "~2.3" --update-no-dev
Make sure to replace "~2.3" with the appropriate latest release version (ignoring the third number, e.g. if the latest release is 2.3.1, then use "~2.3").
 $ php maintenance/update.php
 $ sudo vi LocalSettings.php
And add this line to the bottom of LocalSettings.php:
 enableSemantics('domain_name.com');
Check to see if the mediawiki site recognizes that the extension has been installed by visiting the Special:Version page. You can test that the SMW annotations are working by following the instructions on this page: https://www.semantic-mediawiki.org/wiki/Help:Testing
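The kind of minimal test I mean looks like this (the property and page names are made up for illustration, not part of our data structure): annotate a value on one page, then query it from another page.
 <nowiki>[[Has test property::Some value]]</nowiki>
 <nowiki>{{#ask: [[Has test property::+]] |?Has test property}}</nowiki>
If the annotated value shows up in the query results after saving both pages, the SMW store is working.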
== Installing Semantic Forms (3/25/2016) ==
Don't install from Mediawiki's Extension Distributor (according to the [https://www.mediawiki.org/wiki/Extension:Semantic_Forms/Download_and_installation documentation])! Instead, get it from the Git repository:
 $ cd ~/Downloads
 $ git clone https://git.wikimedia.org/git/mediawiki/extensions/SemanticForms.git
 $ cp -r ./SemanticForms /var/lib/mediawiki/extensions/SemanticForms
 $ cd /var/lib/mediawiki
 $ sudo vi LocalSettings.php
Add the following line to LocalSettings.php:
 include_once("$IP/extensions/SemanticForms/SemanticForms.php");
== Semantic Forms Examples (3/28/2016) ==
There's an [https://www.mediawiki.org/wiki/Extension:Semantic_Forms/Quick_start_guide#Example example data structure] on the Semantic Forms page. I followed most of the steps, except for '''Enabling links to forms'''. I couldn't get #formredlink to work properly (the template wasn't parsing the silent property declaration with #set properly), so I instead added a line to the "Was written by" property page:
 <nowiki>[[Creates pages with form::Author]]</nowiki>
And this way, once a new Book page is created, the redlink for the book's author (when clicked) automatically generates an Author page. I also added #default_form lines at the end of the Book and Author template pages, e.g. for the Author template page, the last few lines looked like:
 <nowiki>
 ...
 [[Category:Authors]]
 {{#default_form:Author}}
 </includeonly>
 </nowiki>
and I did something similar for the Book template page. This way, every Book and Author page will have an "Edit with form" tab in addition to the "Edit" tab (and the "Edit with form" tab is significantly more useful).
== Installing Cargo (3/28/2016) ==
 $ cd ~/Downloads
 $ git clone https://git.wikimedia.org/git/mediawiki/extensions/Cargo.git
 $ cp -r ./Cargo /var/lib/mediawiki/extensions/Cargo
 $ cd /var/lib/mediawiki
 $ sudo vi LocalSettings.php
Add the following line to the LocalSettings.php configuration file:
 require_once( "$IP/extensions/Cargo/Cargo.php" );
Then back to the console to do some PHP updating:
 $ php maintenance/update.php
== Cargo Examples (3/28/2016) ==
First, remove Semantic Mediawiki by going to the composer.json file in the Mediawiki root directory and deleting the line that requires the semantic mediawiki package. Then run <code>sudo composer update</code> from the Mediawiki root directory. Create templates using the special page under the Semantic Forms category. Then to create the data tables, you go to each template page and choose "Create data table" from the dropdown next to the edit tab. After creating the data table once, it seems that you can create the data from the command line (if you do this before creating the data tables, nothing happens...):
 $ cd /var/lib/mediawiki/extensions/Cargo/maintenance
 $ php cargoRecreateData.php
You'll have to edit the template pages to add queries, but at that point, you may just want to write the templates yourself. As with Semantic Mediawiki + Semantic Forms, you can add the #default_form parser function to the template page to display an "edit with form" tab alongside the "edit" tab (you'll likely have to refresh the page to see the changes).
== Cargo Data structuring (3/30/2016) ==
Sahil and I came up with a SQL database schema for the organizations and events in the startup ecosystem. Organization subtypes are: startups, VC funds, accelerators, incubators, and service providers. Event subtypes are: financing, training, and liquidity. Each subtype has fields specific to it, but all organizations need to have a name, logo, URL, address, founding date, and status, and all events need to have a date and need to include which organizations are involved. We tried doing foreign keys, but you can't do that with Cargo, so maybe we should look into other options. One easy way out would be to just duplicate the columns that are common to all organizations in each organization subtype's table, but this seems like bad practice. I found [https://www.mediawiki.org/wiki/Extension:Cargo/Storing_data#Attaching_to_a_table #cargo_attach], which may help us in this sort of situation; a sketch of what that might look like follows.
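Roughly, the base Organization template would declare a shared table and each subtype template would attach to it (the table and field names below are placeholders, not our final schema):
 <nowiki>{{#cargo_declare:_table=Organizations|Name=String|URL=URL|Founding_date=Date}}</nowiki>
 <nowiki>{{#cargo_attach:_table=Organizations}}</nowiki>
A subtype template that attaches like this can then write its rows into the shared Organizations table with #cargo_store, so the common columns live in one place.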
There is an issue with using <code>#cargo_query</code> in the test wiki. When a new page is created that is related to an existing page, the existing page should update automatically to show that it is related to the new page. Instead, the old page won't display its relation to the new page until someone goes into the old page, hits the edit button, and saves an edit to the page. Even if there is no change in the page text from the previous version, the old page will now display its relation to the new page. For example, on the page [http://128.42.44.22/wiki/Bolt Bolt], you can see it is directed by Byron Howard, and on the page [http://128.42.44.22/wiki/Byron_Howard Byron Howard], you can see that Bolt is one of his films. This is working as intended; however, the page [http://128.42.44.22/wiki/Tangled Tangled] shows that Tangled is also directed by Byron Howard, yet Byron Howard's page did not show Tangled as one of his films until I edited the page. It then updated to show Tangled as one of his films.
It seems like the above problem is caused because when <code>#cargo_query</code> is used on a page, the query is not re-run each time the page is viewed, but only when an edit to the page is saved, and that stored result is displayed from then on. This could cause display issues in the future, e.g. an accelerator's page not showing newly added companies.
== Back to Semantic Mediawiki (4/8/2016) ==
Semantic Mediawiki might be better for the inheritance (i.e. what foreign keys would handle in SQL). In SMW, we can define the properties and templates for a superclass and a subclass. If we make a form that includes both templates, the form creates pages that have the templates for the superclass and the subclass included, and the properties are all there. This might be the solution we're looking for, but it seems to be more difficult to query, since there is no explicit distinction between the attributes common to all subclasses and the attributes specific to a single subclass. Ed likes it so far. We should just move forward with the actual data structure on SMW until major roadblocks prevent further progress. Some notes on why SMW seems to be more flexible than Cargo: the properties in SMW are responsible for storing data, whereas in Cargo the templates are responsible for storing data (using #cargo_store calls). This means that a Semantic Form using SMW properties can include multiple templates and still be okay, whereas a Semantic Form using Cargo tables needs to ensure that the templates each write to different tables, which isn't the case for inheritance.
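To make the superclass/subclass idea concrete, the form definition can just include both templates, something like the sketch below (the template and field names are invented, following the standard Semantic Forms form syntax, not anything we've actually built yet):
 <nowiki>{{{for template|Organization}}} {{{field|Name}}} {{{field|URL}}} {{{end template}}}</nowiki>
 <nowiki>{{{for template|Startup}}} {{{field|Funding stage}}} {{{end template}}}</nowiki>
Pages created with this form get both template calls, so the shared properties and the subtype-specific properties all end up set on the same page.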
== To-do list ==
 
* Google Analytics on the test blog. We need FTP access (port 21) to be able to install new plugins, apparently...
 
[[admin_classification::IT Build| ]]
