Web Server Documentation

[[Category: McNair Admin]]

=Old Notes (from Alex Jiang)=
Let's test on the test web server first.

== Bibtex citations with Bibtex (3/14/2016) ==

The [https://www.mediawiki.org/wiki/Extension:Bibtex Bibtex extension] doesn't look like it's being actively maintained, but it might work. I'm testing it on the test web server alongside BibManager.

== Ghost vs. WordPress (3/14/2016) ==

So it looks like we may choose Ghost over WordPress. We need something self-hostable and ideally open-source (both Ghost and WordPress satisfy those conditions). However, Ghost is reportedly more lightweight, so if we're not looking for a lot of extra functionality from third-party plugins, Ghost may be the better choice. I'm setting up Ghost on the [[Test Web Server Documentation#Installing Ghost (3/14/2016)|test web server]], so we'll see how it goes...

It turns out Ghost+Apache is fairly difficult to set up (definitely more difficult than WordPress+Apache), so let's just try WordPress.

Installing WordPress alongside the existing MediaWiki site went smoothly on the [[Test Web Server Documentation#Installing WordPress (3/14/2016)|test web server]], so it seems we'll use WordPress for the blog on this web server as well.

== Infoboxes (3/16/2016) ==

I decided to follow the instructions in [http://trog.qgl.org/20140923/setting-up-infobox-templates-in-mediawiki-v1-23/ this post]. Let's see how it goes.

Step 1:

Download and install the [https://www.mediawiki.org/wiki/Extension:Scribunto Scribunto extension]:

 $ cd ~/Downloads
 $ wget https://extdist.wmflabs.org/dist/extensions/Scribunto-REL1_26-9fd4e64.tar.gz
 $ tar -xzvf Scribunto-REL1_26-9fd4e64.tar.gz
 $ cd /var/lib/mediawiki/extensions
 $ cp -r ~/Downloads/Scribunto ./Scribunto

Add these two lines to LocalSettings.php:

 require_once("$IP/extensions/Scribunto/Scribunto.php");
 $wgScribuntoDefaultEngine = 'luastandalone';

And set execute permissions for the Lua binaries in the extension:

 $ chmod a+x /var/lib/mediawiki/extensions/Scribunto/engines/LuaStandalone/binaries/lua_5_1_5_linux_64_generic/lua

In addition, use a phpinfo page to check that the PCRE version is at least 8.10 (preferably at least 8.33), that PHP's mbstring extension is enabled, and that PHP's proc_open function is not disabled.
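One way to run those three checks is a temporary phpinfo page dropped into the web root (the file name is arbitrary; remove the file once you've checked, since it exposes server details):

```php
<?php
// Temporary diagnostic page: the output lists the PCRE version,
// whether the mbstring extension is loaded, and the disable_functions
// setting (proc_open must not appear in that list).
phpinfo();
```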

Step 2:

Copy Wikipedia's [https://en.wikipedia.org/w/index.php?title=MediaWiki:Common.css&action=edit Common.css] stylesheet into the wiki's Common.css stylesheet.

Step 3:

Export the Infobox template from Wikipedia via the [https://en.wikipedia.org/wiki/Special:Export Special:Export] page. In the "add pages manually" text box, type Template:Infobox, check all three checkboxes below ("Include only the current revision, not the full history", "Include templates", and "Save as file"), then click the Export button and save the XML file.

Step 4:

Import that XML file into the wiki using the Special:Import page. Choose the "Import to default locations" option.

Step 5:

Test your Infobox template by creating a new page on the wiki that uses the Infobox template. I used the following code to test:

<nowiki>
{{Infobox
|title = An amazing Infobox
|header1 = It works!
|label2 = Configured by
|data2 = trog
|label3 = Web
|data3 = http://trog.qgl.org/20140923/setting-up-infobox-templates-in-mediawiki-v1-23/
}}</nowiki>

Debugging:

The template functionality seems to work, but it's not styled properly. So let's try exporting and importing Wikipedia's Common.css stylesheet instead of just copying and pasting, and let's also try exporting and importing Wikipedia's Common.js script into the wiki.

Wait, I fixed it by just removing the custom CSS code left over from trying to change the font-face. If those two things conflict, we may have issues down the line...

I also uncovered something about HTMLTidy that may impact how well templates from Wikipedia run on our MediaWiki [https://www.mediawiki.org/wiki/Manual:Using_content_from_Wikipedia#HTMLTidy]. It looks like we can either [https://www.mediawiki.org/wiki/Manual:$wgTidyConfig set an option] in LocalSettings.php to enable HTMLTidy or [https://en.wikipedia.org/wiki/Wikipedia:WikiProject_Transwiki get the templates from another source].
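For reference, enabling Tidy on a MediaWiki of this vintage looks something like the lines below in LocalSettings.php. Treat this as a sketch and check the manual page linked above: the configuration variables changed in later releases (where $wgTidyConfig replaces these), and the binary path here is an assumption.

```php
// Sketch: pass parser output through the external HTML Tidy binary.
// The path is an assumption; install Tidy first (e.g. apt-get install tidy)
// and adjust the path to wherever it lands on your system.
$wgUseTidy = true;
$wgTidyBin = '/usr/bin/tidy';
```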

== Installing WordPress (3/16/2016) ==

Same as on the [[Test Web Server Documentation#Installing WordPress (3/14/2016)|test web server]].

== Google Analytics for Mediawiki and WordPress (3/16/2016) ==

There's an [https://www.mediawiki.org/wiki/Extension:Google_Analytics_Integration extension] for Google Analytics integration on MediaWiki, and it seems to have pretty robust support (you can exclude specific pages or categories from analytics, and you can exclude user groups from analytics too).

There's also an open-source alternative to Google Analytics called [http://www.openwebanalytics.com/ Open Web Analytics], and there's [https://www.mediawiki.org/wiki/Extension:Open_Web_Analytics a MediaWiki extension] for that too. Open Web Analytics has some cool extra features as well, like click heatmaps...

WordPress appears to have support for both Google Analytics and Open Web Analytics.

After looking around for other open-source alternatives, it appears Piwik is another strong contender. There's a demo of Piwik [http://demo.piwik.org/ here] and a demo of OWA [http://demo.openwebanalytics.com/owa/ here]. There's [https://www.mediawiki.org/wiki/Extension:Piwik_Integration a MediaWiki extension] for Piwik integration, and it seems to be well maintained. WordPress appears to support Piwik as well.
== Open-source Analytics Alternatives (3/21/2016) ==

Might as well try to keep everything open-source. I'll try out Open Web Analytics (OWA) on the test web server to play around with the interface.

OWA isn't going to work, as noted on the [[Test Web Server Documentation#Installing Open Web Analytics (3/21/2016)|test web server page]]. So let's try the [https://www.mediawiki.org/wiki/Extension:Piwik_Integration extension] for Piwik too.

So at least Piwik works. But here's the counterargument: in five years, which is more likely to be well supported and maintained, Piwik or Google Analytics? With the obvious answer being Google Analytics, we should just use that.

== Back to Google Analytics (3/23/2016) ==

We made a new Google Analytics account under admin@mcnaircenter.org.

I'm going to go ahead and test the Google Analytics integration extension on the [[Test Web Server Documentation#Installing Google Analytics (3/23/2016)|test web server]].

== Cargo vs Semantic Mediawiki? (3/25/2016) ==

I recently learned about Cargo, which claims to be a more straightforward version of Semantic MediaWiki (SMW). See the [https://www.semantic-mediawiki.org/w/images/9/9a/Cargo_and_the_future_of_SMW.pdf slides] from a presentation given at the spring 2015 SMWCon, and the Cargo extension's [https://www.mediawiki.org/wiki/Extension:Cargo/Cargo_and_Semantic_MediaWiki comparison] page. The lead author of the extension, Yaron, is a member of the SMW community, so Cargo is likely pretty legit. Now I'm not sure which is better...

After some more deliberation, I think Cargo wins. Cargo's querying syntax is more like SQL (which is actually useful and pretty easy to learn), and Cargo also doesn't deal with all of the property declarations that Semantic MediaWiki requires. Also, Cargo has native support for JSON export, while SMW doesn't (and any extensions that provide such support are pretty stale).
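For a taste of that SQL-ish syntax, a Cargo query embedded in a page looks roughly like this (the table and field names here are hypothetical, not something defined on our wiki):

```
{{#cargo_query:
tables=Books
|fields=_pageName=Book,Author,Year
|where=Year>2010
|order by=Year DESC
|limit=10
|format=table
}}
```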

== CSS Design (4/22/2016) ==

A couple of notes on where "obvious" (hint: not so obvious) things are. (Note: all paths that follow are relative to the MediaWiki root directory, which should be /var/lib/mediawiki.)

First, the logo for the page is defined in LocalSettings.php; look for the $wgLogo variable. I used an FTP client to upload new logos, but you could use a terminal and wget the file if you have it online somewhere.

For changing CSS rules, I just used the Chrome inspector (F12, or right-click and choose "Inspect" from the context menu) to understand which CSS selector rules were being applied and which were being overridden. You can also make small CSS changes in the inspector; they are lost upon refreshing the page, but they can be useful for experimenting with different colors, positions, etc.

You can use grep -r "[words_to_search_for]" on the command line to search for something (a CSS hex color code, a CSS selector, etc.) in all files and directories under the current directory. I usually ran this from the skins/Vector directory to make finding CSS properties easier.
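As a sketch of that workflow (the directory, file, and rule below are made up for the demo; on the server you would run the grep from skins/Vector):

```shell
# Make a scratch stand-in for a skin directory with one LESS file in it.
demo=/tmp/less-grep-demo
mkdir -p "$demo"
printf '@demo-color: #e87511;\n' > "$demo/variables.less"

# Recursively search every file under the directory for a hex color code.
# -n adds line numbers, which helps when a file is long.
grep -rn "#e87511" "$demo"
```

The output lists each match as file:line:content, which is usually enough to jump straight to the rule you need to change.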

The CSS is actually written in LESS, an extension of CSS syntax that allows nested rules, variables, etc. The skins/Vector/variables.less file holds all the variables, which are prefixed with an at sign (@) in LESS. WARNING: if you use a variable name that hasn't been defined (due to a typo, for example), ALL of the CSS/LESS stops working. The plus side is that it's obvious you messed up; the downside is that it may not be obvious where exactly you messed up, so make small changes and refresh the browser view constantly. Most of the other LESS rules are in the skins/Vector/components folder, and the file names are fairly reasonable: common.less defines rules common to the entire page, navigation.less the left sidebar, personalMenu.less the set of links in the top right corner for the user account, and footer.less the footer. One more file in skins/Vector is useful for understanding how everything comes together: VectorTemplate.php, which contains the high-level HTML structure.
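A tiny sketch of what LESS variables and nesting look like (the variable name and rule here are illustrative, not actual Vector variables):

```less
// In variables.less: define a variable (note the @ prefix).
@demo-link-color: #e87511;

// In a component file: nested rules compile to "#content a { color: ... }".
// A typo in the variable name here is exactly the failure mode described
// above: the whole stylesheet silently stops compiling.
#content {
  a {
    color: @demo-link-color;
  }
}
```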
== To-do list ==

* extra namespaces for IntraACL stuff. see [https://www.mediawiki.org/wiki/Manual:Using_custom_namespaces here]
* inconsistent styling: links aren't orange on special pages, fonts and links are the default in the "mobile" view

== In progress ==

* Mediawiki CSS styling - '''custom fonts fixed, need new designs/layouts'''
* analytics - '''getting GA installed for WordPress blogs, need port 21 opened'''

== Potential pitfalls ==

* It looks like the Common.css stylesheet has to match the Wikipedia Common.css stylesheet exactly for the Wikipedia Infobox templates to be styled properly; I solved the problem of incorrectly styled infoboxes by deleting all of the custom CSS that we had written for the wiki.

== Installing and configuring the Backup Drive ==

=New Notes=

==Mounting the RDP==

 $ sudo apt-get install cifs-utils
 $ sudo mount -t cifs //128.42.44.182/mcnair /mnt/rdp -o user=researcher,domain=ad.mcnaircenter.org
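To make the mount survive reboots, an /etc/fstab entry along these lines should work (the credentials file path is an assumption; using one keeps the password out of the world-readable fstab):

```
# /etc/fstab -- all one line
//128.42.44.182/mcnair  /mnt/rdp  cifs  credentials=/root/.smbcredentials,domain=ad.mcnaircenter.org  0  0

# /root/.smbcredentials (chmod 600), assumed contents:
# username=researcher
# password=<password>
```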

==Mobile Interface==

===Folders===
* The relevant source file can be found at

 /var/lib/mediawiki/extensions/MobileFrontend/minerva.less

===Tips===
* Using a [http://www.mobilephoneemulator.com/ mobile emulator] helps you understand what the mobile interface will look like before deploying to production.

== User Access (6/15/2016) ==

'''Objective'''

Accounts are to be vetted before they are created. Since we allow users to edit public wiki pages, we would like a queue of account creation requests that must be approved before the accounts are created.

* Helpful material:
** [https://www.mediawiki.org/wiki/Extension:ConfirmAccount Mediawiki documentation]
** mcnair@rice.edu - the account that will approve account creation requests.

Steps followed:

'''Package installation steps:'''

 $ cd extensions/
 $ wget https://extdist.wmflabs.org/dist/extensions/ConfirmAccount-REL1_26-d6e2f46.tar.gz
 $ tar -xzf ConfirmAccount-REL1_26-d6e2f46.tar.gz
 $ sudo pear install mail
 $ sudo pear install net_smtp

The above steps set up the ConfirmAccount package and its email notification system.

'''Configuring the ConfirmAccount PHP files'''

The following files need to be updated as follows:

* ConfirmAccount.php:

Set the confirmation queues to point to folders that www-data has access to:

 // For changing path in accountreqs
 $wgConfirmAccountPathAR = $IP . "/images/accountreqs";

 // For changing path in accountcreds
 $wgConfirmAccountPathAC = $IP . "/images/accountcreds";
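Those two queue directories have to exist and be writable by the web server user. A minimal sketch of creating them, shown here under a scratch prefix so it runs anywhere (on the server the prefix is /var/lib/mediawiki and you would also chown them to www-data):

```shell
# Scratch prefix stands in for /var/lib/mediawiki on the real server.
PREFIX=/tmp/confirmaccount-demo
mkdir -p "$PREFIX/images/accountreqs" "$PREFIX/images/accountcreds"

# On the server you'd additionally run:
#   sudo chown -R www-data:www-data /var/lib/mediawiki/images

ls "$PREFIX/images"
```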

* ConfirmAccount.config.php:

Change the directories to those defined in ConfirmAccount.php:

 $wgFileStore['accountreqs']['directory'] = $wgConfirmAccountPathAR;
 $wgFileStore['accountcreds']['directory'] = $wgConfirmAccountPathAC;

* LocalSettings.php:

 $wgEnableEmail = true;
 $wgEmergencyContact = "mcnair@rice.edu";
 $wgPasswordSender = "mcnair@rice.edu";
 # User account confirmation
 require_once "$IP/extensions/ConfirmAccount/ConfirmAccount.php";

 $wgSMTP = array(
     'host' => 'ssl://smtp.mail.rice.edu',
     'IDHost' => '128.42.44.22',
     'port' => 465,
     'username' => 'mcnair@rice.edu',
     'password' => '*********',
     'auth' => true
 );
 $wgConfirmAccountContact = 'mcnair@rice.edu';

'''Updating the wiki'''

 $ cd /var/lib/mediawiki/maintenance
 $ php update.php

[[admin_classification::IT Build| ]]

== Mediawiki extensions ==

=== Semantic Mediawiki Extensions ===

The Semantic MediaWiki (SMW) extension installation process requires composer.phar to be installed; all further additions to SMW are done through composer.phar.

==== Installing Mediawiki Composer.phar ====

Here are the installation instructions: [https://getcomposer.org/doc/00-intro.md#installation-nix]

==== Installing Extension: Semantic Result Formats ====

* Here is the link to the installation process :
* Here is the command to be run in the Mediawiki root folder (/var/lib/mediawiki):

 $ php composer.phar require --update-no-dev mediawiki/semantic-result-formats "2.*"

== Using Google Analytics ==

Go to https://analytics.google.com/ and log in with admin@mcnaircenter.org.

To do:
* Check base analytics configuration
* Check integration with Google Search
* Install appropriate APIs
Latest revision as of 14:26, 19 March 2017
== Installing Ubuntu aka Trying RAID 10 (2/15/2016) ==
Some general configuration options:
- hostname: McNairWebServ
- user full name: McNair Center
- username: mcnair
- don't encrypt home directory
- manual partitioning (see below for configuration of RAID)
- no automatic updates
- software: LAMP stack
Sahil and I tried to configure RAID 10 using the software RAID option in the installer, which is documented here. We put two 64 GB swap partitions on the first two hard drives, and created two ext4 partitions that took up the rest of the space on those two drives. For the other two drives, we used a single ext4 partition for each drive. For all of the ext4 partitions, we set the bootable flag to "on." Then we chose to configure the software RAID, created a new MD device, and chose RAID10 with 2 active devices and 2 spare devices. For the active devices, we chose the two ext4 partitions on the first two hard drives, and for the spare devices, the two ext4 partitions on the other two drives. But then the installation process failed when the GRUB boot loader couldn't be installed, because GUID partition tables (GPT) need a designated, small (1 MB is enough) partition for the GRUB bootloader.
So we started partitioning from scratch, but with only two hard drives for a RAID1 array. On the first drive there are three partitions: one 1 MB partition reserved for the bootloader, one 64 GB swap partition, and the rest of the drive as an ext4 partition for the filesystem. On the second drive there are two partitions: one 1 MB partition reserved for the bootloader and the rest of the drive as an ext4 partition for the filesystem. Then we made two software RAID devices, each with 2 active devices and 0 spare devices. The first RAID device had both of the bootloader partitions as the active devices, and the second RAID device had both of the ext4 filesystem partitions as the active devices. Then we set the first RAID device to "use as ext4" with mount point /boot and the second RAID device to "use as ext4" with mount point /, and continued with the installation. This time, it failed to install the kernel.
I guessed that, because the 1 MB RAID device was made first, the kernel tried to install itself to that device and failed. So I went back to the partitioner, set the first RAID device to "do not use", and tried the installation process again. It prompted me a couple of times warning that the old filesystem would be overwritten, but I continued the installation regardless. But then the GRUB boot loader failed again, even when we tried not installing it to the master boot record (MBR) and instead installing it to /dev/md0 or /dev/md0_raid1.
== Configuring RAID 1 on Web Server (2/17/2016) ==
The first RAID device (/dev/md0) we set to use as an ext4 filesystem and mounted /boot to it, and the second RAID device (/dev/md127) we set to use as an ext4 filesystem and mounted / to it (we tried this before, but it failed to install the kernel). This time, it failed to install the bootloader, but it never prompted me to choose where to install the bootloader (usually it asks whether you'd like to install the bootloader to the master boot record).
Second partitioning attempt:
First hard disk (/dev/sda):
- 10 MB partition, use as reserved BIOS boot area, bootable flag off
- 64 GB partition, use as swap space
- rest of the space partition, use as ext4 filesystem, mount point /, bootable flag off
Second hard disk (/dev/sdb):
- 10 MB partition, use as reserved BIOS boot area, bootable flag off
- rest of the space partition, use as ext4 filesystem, mount point /, bootable flag off
Write partition changes to disk and then start configuring software RAID:
- First RAID device (/dev/md0): RAID1, 2 active devices (/dev/sda3 and /dev/sdb2), 0 spare devices
- Second RAID device (/dev/md1): RAID1, 2 active devices (/dev/sda1 and /dev/sdb1), 0 spare devices
- first RAID device partition: use as ext4 filesystem, mount point /
- second RAID device partition: use as ext4 filesystem, mount point /boot, format data on the partition
Failed to install GRUB bootloader on a hard disk (again).
Next attempt:
- First RAID device (/dev/md0): use as ext4 filesystem, mount point /, format data on the partition
- Second RAID device (/dev/md1): erase data on partition, use as "do not use"
Next attempt: Redo the RAID devices so that the first device (/dev/md0): RAID1, 2 active devices (/dev/sda1 and /dev/sdb1), 0 spare devices, and the second RAID device (/dev/md1): RAID1, 2 active devices (/dev/sda3 and /dev/sdb2), 0 spare devices. Then configure the RAID devices:
- first RAID device partition: use as ext4 filesystem, mount point /boot, format data on the partition
- second RAID device partition: use as ext4 filesystem, mount point /, format data on the partition
New idea: ditch the idea of RAID on the boot partitions (we'll put the bootloader on one of the boot partitions and then we can try to set up RAID once we've got the thing booting into Linux), so leave the partitions as above ("Second Partitioning Attempt"). Only make one software RAID device (/dev/md0): RAID1, 2 active devices (/dev/sda3 and /dev/sdb2), 0 spare devices. Then configure the first RAID device partition: use as ext4 filesystem, mount point /, format data on the partition.
Third partitioning attempt:
First hard disk (/dev/sda):
- 10 MB partition, use as reserved BIOS boot area, bootable flag off
- 32 GB partition, use as swap space
- rest of the space partition, use as ext4 filesystem, mount point /, bootable flag off
Second hard disk (/dev/sdb):
- 10 MB partition, use as reserved BIOS boot area, bootable flag off
- 32 GB partition, use as swap space
- rest of the space partition, use as ext4 filesystem, mount point /, bootable flag off
One RAID device (/dev/md0): RAID1, 2 active devices (/dev/sda3 and /dev/sdb3), 0 spare devices. set partition: use as ext4 filesystem, mount point /
Fourth partitioning attempt:
First hard disk (/dev/sda):
- 10 MB partition, use as reserved BIOS boot area, bootable flag off
- 32 GB partition, use as swap space
- rest of the space partition, use as ext4 filesystem, mount point /, bootable flag off
Second hard disk (/dev/sdb):
- 10 MB partition, use as reserved BIOS boot area, bootable flag off
- 32 GB partition, use as swap space
- rest of the space partition, use as ext4 filesystem, mount point /, bootable flag off
First RAID device (/dev/md0): RAID1, 2 active devices (/dev/sda3 and /dev/sdb3), 0 spare devices. set partition: use as ext4 filesystem, mount point /
Second RAID device (/dev/md1): RAID1, 2 active devices (/dev/sda1 and /dev/sdb1), 0 spare devices. set partition: use as ext4 filesystem, mount point /boot
Third RAID device (/dev/md2): RAID0, 2 active devices (/dev/sda2 and /dev/sdb2). set partition: use as swap area
Fifth partitioning attempt (made sure all software RAID devices are removed, delete all partitions, create new partition tables):
First hard disk (/dev/sda):
- 10 MB partition, use as reserved BIOS boot area, bootable flag off
- 32 GB partition, use as swap space
- rest of the space partition, use as ext4 filesystem, mount point /, bootable flag off
Second hard disk (/dev/sdb):
- 10 MB partition, use as reserved BIOS boot area, bootable flag off
- 32 GB partition, use as swap space
- rest of the space partition, use as ext4 filesystem, mount point /, bootable flag off
First RAID device (/dev/md0): RAID1, 2 active devices (/dev/sda3 and /dev/sdb3), 0 spare devices. set partition: use as ext4 filesystem, mount point /
Install the GRUB bootloader to /dev/sda and /dev/sdb. It works!
== Network Configuration (2/22/2016) ==

As with the test web server, network configuration can be annoying. First, I had to figure out the right LAN port on the mobo by plugging in the RJ45 cable and waiting for the LED to light up (it took about 5 seconds and a couple of tries). Then I went to the terminal to check on the network interfaces:

 $ ifconfig
 $ ifconfig -a
 $ sudo ifconfig eth0 up
 $ cat /etc/network/interfaces

After bringing up the eth0 interface (it's down if it's not listed in the output of ifconfig), I modified /etc/network/interfaces to set up the eth0 interface:

 $ sudo vi /etc/network/interfaces

And added these lines:

 auto eth0
 iface eth0 inet dhcp
     dns-nameservers 8.8.8.8 8.8.4.4

Then I used ifdown/ifup to reconfigure the interface:

 $ sudo ifdown eth0
 $ sudo ifup eth0

There are a couple of configuration files and commands you can check to make sure that the network configured correctly (I compared the output to the corresponding files on the test web server):

 $ hostname -I
 $ cat /etc/resolv.conf
 $ cat /etc/hosts
 $ cat /var/lib/dhcp/dhclient.eth0.leases

Then I checked whether it was connected to the internet:

 $ ping google.com
 $ sudo apt-get update
I got a "GPG error: http://security.ubuntu.com trusty-security InRelease: Clearsigned file isn't valid, got 'NODATA' (does the network require authentication?)" message on the apt-get update a couple of times, so I tried sudo ifdown eth0 and sudo ifup eth0 a couple of times. Then I rebooted the machine and tried to update the package manager again, and it still didn't work.
These results seem familiar; I think I had the same error when I tried to connect the test web server to the internet before Ed filed the ticket with the IT help desk, which suggests that we may have given the wrong MAC address or IT messed up the configuration. Still, I checked all of the configuration files. I only noted a couple of differences between the test web server network interface and this web server network interface:
- The IP addresses are different. The test web server has an address that starts with 128, but this webserver has an address that starts with 10. (Ed thinks this is a sign that this webserver's IP address limits it to the Rice network).
- The subnet masks are different. The test web server has a subnet mask that ends in 240, but this webserver has a mask that ends in 0.
- The test webserver has a DNS domain name (i.e. the output of hostname -d) of attlocal.net. This webserver doesn't have one. I tried adding it (by editing /etc/hosts), but that change alone didn't help.
Interesting side note: going into the mobo BIOS menu, under "Server Mgmt" there is a submenu "BMC network configuration" that shows the MAC address for "DM_LAN1" as ending in de, whereas the MAC address for eth0 ends in dc (otherwise, the two MAC addresses are the same). So maybe the mobo is interfering with the MAC address? But changing DM_LAN1's Config Address source from "Previous State" to "DynamicBmcDhcp" doesn't fix the problem (and upon reboot, it switches back to Previous State).
Turns out IT just configured the network IP addresses incorrectly. Ed and I talked to the IT desk on Tuesday and we got new IP addresses.
== Installing Software (2/24/2016) ==

Now that we have an internet connection, we can start getting packages:

 $ sudo apt-get update
 $ sudo apt-get upgrade

Since I didn't install the SSH server in the beginning, I'll go ahead and install the openssh-server package now:

 $ sudo apt-get install openssh-server

Back up the SSH server config file:

 $ sudo cp /etc/ssh/sshd_config /etc/ssh/sshd_config.original
== Installing Mediawiki (3/7/2016) ==

As with the test web server, I followed the steps from this page on installing Mediawiki.

The stable version of Mediawiki (1.26.2) isn't available through apt-get, so we're downloading the official tarball:

 $ mkdir ~/Downloads
 $ cd ~/Downloads
 $ wget https://releases.wikimedia.org/mediawiki/1.26/mediawiki-1.26.2.tar.gz
 $ tar -xvzf mediawiki-1.26.2.tar.gz

Copy the extracted files to /var/lib/mediawiki:

 $ sudo mkdir /var/lib/mediawiki
 $ sudo mv mediawiki-1.26.2/* /var/lib/mediawiki

Then set up the mediawiki directory:

 $ cd /var/www/html
 $ sudo ln -s /var/lib/mediawiki mediawiki

Now point a browser to http://[ip_address]/mediawiki/mw-config/index.php and configure the Mediawiki site as follows:

Choose English for both "your language" and the "wiki language" and continue to the next page. Make sure that all of the environmental checks pass before continuing to the next page. Leave the "database host" as localhost and change "database name" to mcnair. Leave "database table prefix" empty and "database username" as root. Set the "database password" to whatever the password for the MySQL user was set to during installation and continue to the next page. Check the box for "Use this account for installation", choose InnoDB for "Storage Engine", choose Binary for "Database character set", and continue to the next page. Set the name of the wiki to McNair Center and let the project namespace be the same as the wiki name. For the administrator account, set the username, password, and email. Choose to subscribe to the release announcements mailing list if you provide an email, and choose to answer more questions.

Choose "open wiki" for the user rights profile. Choose "no license footer". Uncheck the box for "enable outbound email" and choose which skin you'd like to use. For extensions, leave them all unchecked. Leave "enable file uploads" unchecked. Don't change the Logo URL and don't check "enable Instant Commons". For caching, choose "no caching".

Copy the downloaded LocalSettings.php configuration file onto the web server in the root directory of the mediawiki installation (/var/lib/mediawiki). Then point a browser to http://[ip_address]/mediawiki and see your new site!
== Short URLs (3/7/2016) ==

Same as for the test web server.

== Labeled Section Transclusion (3/7/2016) ==

Same as for the test web server.

== Responsive Design (3/7/2016) ==

Same as for the test web server.
== Mediawiki CSS changes (3/9/2016) ==
Started working with Julia on the mediawiki website CSS design (color scheme and typography on Website Design). Ran into a couple of problems:
- If you upload a file to Slack and want to download it from its URL using the wget command on command-line, make sure you get a public link from the person who uploaded the file, otherwise the file won't be downloaded. (I was trying to figure out why the McNair logo that Julia sent me on slack wasn't showing up on the website, but it turns out I just needed a public link to the file, which should look something like https://files.slack.com/files-pri/T0JA2A9Q9-F0RL0G4BZ/mcnair.png?pub_secret=30505f5d02).
- the @font-face rule doesn't seem to work in Common.css... I never got past this problem. I think the .ttf file for the font may have failed to download onto the server properly, but I haven't found a good way to test for that case. Also, I tried using an absolute URL (i.e. http://128.42.44.180/mediawiki/resources/assets/fonts/franklin-gothic-book.ttf) when specifying the @font-face rule, but it doesn't seem to help. Using an absolute URL to the Slack file's public URL (i.e. https://files.slack.com/files-pri/T0JA2A9Q9-F0RLDB3G8/download/franklin-gothic-book.ttf?pub_secret=327cdaaeb8) doesn't seem to work either.
Well, I don't really trust the file to download onto the webserver properly from terminal, so I got an SFTP client and used that to copy the .ttf file onto the webserver. Still no dice.
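One way to test for a corrupted transfer is to checksum the file on both machines and compare. A minimal sketch (the filename is the font from the notes above; the file contents here are a stand-in I invented so the commands run anywhere):

```shell
# Sketch: verify a transferred file arrived intact by comparing checksums.
# Create a stand-in for franklin-gothic-book.ttf so this runs anywhere:
printf 'stand-in font bytes' > franklin-gothic-book.ttf

# Run the same command on the source machine and on the web server;
# if the two hashes differ, the transfer corrupted the file.
md5sum franklin-gothic-book.ttf | cut -d' ' -f1
```

The same comparison works with sha256sum. A mismatch means the .ttf should be re-copied, and would explain @font-face silently failing.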
== Setting up users (3/11/2016) ==
First, getting the ImportUsers extension for bulk account creation (using a CSV). Downloading the extension is as follows:
 $ cd ~/Downloads
 $ wget https://extdist.wmflabs.org/dist/extensions/ImportUsers-REL1_26-0fe9e22.tar.gz
 $ tar -xzvf ImportUsers-REL1_26-0fe9e22.tar.gz
 $ cd /var/lib/mediawiki/extensions
 $ cp -r ~/Downloads/ImportUsers ./ImportUsers
Then edit LocalSettings.php and add this line:
 require_once("$IP/extensions/ImportUsers/ImportUsers.php");
Then we just have to make a CSV with columns for username, password, email, real name, and user groups (optional). More info on the extension documentation page.
I made a small little CSV to test the ImportUsers extensions:
 user1,pass1,user1@example.com,Dummy One
 user2,pass2,user2@example.com,Dummy Two
 user3,pass3,user3@example.com,Dummy Three
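For larger batches, the CSV can be generated from the shell. A sketch using a heredoc (column order username, password, email, real name, as described above; the optional groups column is omitted):

```shell
# Sketch: write the ImportUsers test CSV from the shell.
# Columns (per the notes): username,password,email,real name
cat > users.csv <<'EOF'
user1,pass1,user1@example.com,Dummy One
user2,pass2,user2@example.com,Dummy Two
user3,pass3,user3@example.com,Dummy Three
EOF

wc -l < users.csv   # prints 3
```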
After importing the users, run a maintenance script from the command line to update new user statistics:
 $ cd /var/lib/mediawiki/maintenance
 $ php initSiteStats.php
But this runs into some errors (this page suggests setting the MW_INSTALL_PATH environment variable, but I can't find a good way to do that). I looked into the error messages and found this SO post which seems to cover it. I don't know whether we need SNMP, so I decided to just install it to be safe:
 $ sudo apt-get install snmp
And the error messages go away. Alternatively, you can disable the snmp module for PHP with:
 $ sudo php5dismod snmp
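As an aside, the MW_INSTALL_PATH suggestion mentioned above is just ordinary per-command environment-variable scoping. A small demonstration of the pattern (the value is the wiki root from these notes; whether this actually resolves the initSiteStats errors depends on the MediaWiki version):

```shell
# Prefixing VAR=value to a command sets the variable for that command only.
# For a maintenance script the pattern would be:
#   MW_INSTALL_PATH=/var/lib/mediawiki php initSiteStats.php
# Demonstrated with a child shell that echoes the variable:
MW_INSTALL_PATH=/var/lib/mediawiki sh -c 'echo "$MW_INSTALL_PATH"'
# prints /var/lib/mediawiki
```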
We also want to limit account creation to sysops only as done here. To do this, edit LocalSettings.php and add these lines:
 # Prevent new user registrations except by sysops
 $wgGroupPermissions['*']['createaccount'] = false;
== BibTex citations with BibManager (3/11/2016) ==
The BibManager extension isn't actively maintained, but it doesn't seem to need constant updates to accommodate new features, and it was last updated for Mediawiki version 1.22, which isn't too bad.
Let's test on the test web server first.
== Bibtex citations with Bibtex (3/14/2016) ==
The Bibtex extension doesn't look like it's being actively maintained, but it might work. I'm testing it on the test web server alongside BibManager.
== Ghost vs. WordPress (3/14/2016) ==
So it looks like we may choose Ghost over WordPress. We need something self-hostable, and ideally open-source (and both Ghost and WP satisfy those two conditions). However, I hear Ghost is more lightweight, so if we're not looking for a lot of extra functionality from third-party plugins, Ghost may be the better choice. I'm setting up Ghost on the test web server, so we'll see how it goes...
Turns out Ghost+apache is kinda difficult (definitely more difficult than WordPress+Apache), so let's just try WordPress.
The test web server had a pretty easy time installing WordPress alongside the existing mediawiki site, so it seems that we'll use WP for the blog on this web server as well.
== Infoboxes (3/16/2016) ==
I decided to follow the instructions in this post. Let's see how it goes.
Step 1:
Download and install the Scribunto extension.
 $ cd ~/Downloads
 $ wget https://extdist.wmflabs.org/dist/extensions/Scribunto-REL1_26-9fd4e64.tar.gz
 $ tar -xzvf Scribunto-REL1_26-9fd4e64.tar.gz
 $ cd /var/lib/mediawiki/extensions
 $ cp -r ~/Downloads/Scribunto ./Scribunto
Add these two lines to LocalSettings.php:
 require_once("$IP/extensions/Scribunto/Scribunto.php");
 $wgScribuntoDefaultEngine = 'luastandalone';
And set execute permissions for Lua binaries in the extension:
 $ chmod a+x /var/lib/mediawiki/extensions/Scribunto/engines/LuaStandalone/binaries/lua_5_1_5_linux_64_generic/lua
In addition, use a phpinfo page to check that the PCRE version is at least 8.10 (preferably at least 8.33), that PHP's mbstring extension is enabled, and that PHP's proc_open function is not disabled.
Step 2:
Copy Wikipedia's Common.css stylesheet into the wiki's Common.css stylesheet.
Step 3:
Export the Infobox template from Wikipedia from the Special:Export page. In the "add pages manually" text box, type Template:Infobox and then check all three checkboxes below: "Include only the current revision, not the full history", "Include templates", and "Save as file", then click the Export button and save the XML file.
Step 4:
Import that XML file onto the wiki using the Special:Import page. Choose the "Import to default locations" option.
Step 5:
Test your Infobox template by creating a new page on the mediawiki and using the Infobox template. I used the following code to test:
 {{Infobox
 |title = An amazing Infobox
 |header1 = It works!
 |label2 = Configured by
 |data2 = trog
 |label3 = Web
 |data3 = http://trog.qgl.org/20140923/setting-up-infobox-templates-in-mediawiki-v1-23/
 }}
Debugging:
I seem to have the template functionality working, but it's not styled properly. So let's try exporting and importing Wikipedia's Common.css stylesheet instead of just copying and pasting. And let's also try exporting and importing Wikipedia's Common.js script into the wiki.
Wait, I fixed it by just removing the custom CSS code that I had from trying to change the font-face. If those two things conflict, we may have issues down the line...
I also uncovered something about HTMLTidy that may impact how well templates from Wikipedia run on our mediawiki [1]. It looks like we can either set an option in LocalSettings.php to enable HTMLTidy or we can get the templates from another source.
== Installing WordPress (3/16/2016) ==
Same as the test web server
== Google Analytics for Mediawiki and WordPress (3/16/2016) ==
There's an extension for Google Analytics integration on Mediawiki, and it seems to have pretty robust support (you can exclude specific pages or categories from analytics, and you can exclude user groups from analytics too).
There's an open-source alternative to google analytics called Open Web Analytics, and there's a Mediawiki extension for that too. Looks like Open Web Analytics has some cool extra features too like click heatmaps...
WordPress appears to have support for both Google Analytics and Open Web Analytics.
After looking around for other open-source alternatives, it appears Piwik is another strong contender. There's a demo of Piwik here and a demo of OWA here. There's a Mediawiki extension for Piwik integration, and it seems to be pretty well maintained. WordPress also appears to support Piwik as well.
== Open-source Analytics Alternatives (3/21/2016) ==
Might as well try to keep everything open-source. I'll try out Open Web Analytics (OWA) on the test web server to play around with the interface.
OWA isn't going to work, as noted on the test web server page. So let's try the extension for Piwik too.
So at least Piwik works. But here's the counterargument: in five years, which is more likely to be well-supported and maintained, Piwik or Google Analytics? And with the obvious answer being Google Analytics, we should just use that.
== Back to Google Analytics (3/23/2016) ==
We made a new Google Analytics account under admin@mcnaircenter.org!
I'm going to go ahead and test the Google Analytics integration extension on the test web server.
== Cargo vs Semantic Mediawiki? (3/25/2016) ==
I recently learned about Cargo, which claims to be a more straightforward version of SMW. See the slides from a presentation given at the spring 2015 SMWCon, and the comparison page linked from the Cargo extension page. The lead author of the extension, Yaron, is a member of the SMW community, so Cargo is likely pretty legit. Now I'm not sure which is better...
After some more deliberation, I think Cargo wins. Cargo's querying syntax is more like SQL (which is actually useful and pretty easy to learn), and Cargo also doesn't deal with all of the property declarations that Semantic Mediawiki requires. Also, Cargo has native support for JSON exporting, while SMW doesn't (and any extensions that provide such support are pretty stale).
== CSS Design (4/22/2016) ==
Couple of notes on where "obvious" (hint: not so obvious) things are. (Note, all paths that follow are relative to the Mediawiki root directory, which should be in /var/lib/mediawiki).
First, the logo for the page is defined in LocalSettings.php. Look for the $wgLogo variable. I used an FTP client to upload new logos, but you could use a terminal and wget the file if you have it online somewhere.
For changing CSS rules, I just used the Chrome inspector (F12 or right-click and choose "Inspect" from the option menu) to understand which CSS selector rules were being applied and which were being overridden. You can also make small CSS changes in the inspector that are lost upon refreshing the page, but can be useful for experimenting with different colors, positions, etc.
You can use $ grep -r "[words_to_search_for]" on the command line to search for something (a CSS hex color code, a CSS selector, etc.) in all files and directories in the current directory. I usually used this while in the skins/Vector directory to make finding CSS properties easier.
The CSS is actually written in LESS, an extension of CSS syntax that allows nested properties, variables, etc. The skins/Vector/variables.less file holds all the variables, which are prefixed with an at sign (@) in LESS.

WARNING: if you try to use a variable name that hasn't been defined (due to a typo, for example), ALL of the CSS/LESS will stop working. The plus side is that it's obvious that you messed up. The down side is that it may not be obvious where exactly you messed up, so make small changes and refresh the browser view constantly.

Most of the other LESS rules are in the skins/Vector/components folder. The file names are fairly reasonable:
* common.less defines rules common to the entire page
* navigation.less defines the area on the left sidebar
* personalMenu.less defines the set of links in the top right corner for the user account
* footer.less defines the footer
There's also another file in skins/Vector that is useful for understanding how everything comes together: VectorTemplate.php, which contains the high-level HTML structure.
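The grep workflow can be sketched end to end. A hypothetical miniature (the variable name and file contents are invented for illustration, not taken from the real Vector skin):

```shell
# Build a tiny stand-in for the Vector skin layout:
mkdir -p skins/Vector/components
echo '@link-color: #f60;' > skins/Vector/variables.less
echo 'a { color: @link-color; }' > skins/Vector/components/common.less

# grep -rn reports every definition and use of a LESS variable with
# file names and line numbers -- useful before renaming or retyping one:
grep -rn '@link-color' skins/Vector
```

This doubles as a quick sanity check against the undefined-variable failure mode: if the definition line doesn't show up in the output, the variable name is mistyped somewhere.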
== To-do list ==
* extra namespaces for IntraACL stuff. See here
* inconsistent styling: links aren't orange on special pages; fonts and links are the default in the "mobile" view
== In progress ==
* Mediawiki CSS styling - custom fonts fixed, need new designs/layouts
* analytics - getting GA installed for the WordPress blogs, need port 21 opened
== Potential pitfalls ==
* It looks like the Common.css stylesheet has to exactly match Wikipedia's Common.css for the Wikipedia Infobox templates to be styled properly: I solved the problem of the infoboxes being styled incorrectly by deleting all of the custom CSS we had written for the mediawiki.
== Installing and configuring the Backup Drive ==
=New Notes=
== Mounting the RDP ==
 apt-get install cifs-utils
 mount -t cifs //128.42.44.182/mcnair /mnt/rdp -o user=researcher,domain=ad.mcnaircenter.org
== Mobile Interface ==
=== Folders ===
* The relevant source file can be found at:
 /var/lib/mediawiki/extensions/MobileFrontend/minerva.less
=== Tips ===
* Using a mobile emulator helps you understand what the mobile interface will look like before deploying to production.
== User Access 6/15/2016 ==
=== Objective ===
Accounts are to be vetted before they are created. Since we allow users to edit public wiki pages, we would like a queue of account creation requests that must be approved before the accounts are created.
* Helpful Material:
** Mediawiki Documentation
** mcnair@rice.edu - the account that will approve account creation.
=== Steps Followed ===

==== Package Installation Steps ====
 cd extensions/
 wget https://extdist.wmflabs.org/dist/extensions/ConfirmAccount-REL1_26-d6e2f46.tar.gz
 tar -xzf ConfirmAccount-REL1_26-d6e2f46.tar.gz
 sudo pear install mail
 sudo pear install net_smtp
The above steps set up the email notification system and the ConfirmAccount package.
==== Configuring ConfirmAccount PHP files ====

The following files need to be updated as follows:
* ConfirmAccount.php:
Set the confirmation queues to point to folders that www-data has access to:
 // For changing path in accountreqs
 $wgConfirmAccountPathAR = $IP . "/images/accountreqs";
 // For changing path in accountcreds
 $wgConfirmAccountPathAC = $IP . "/images/accountcreds";
* ConfirmAccount.config.php:
Change the directories to those defined in ConfirmAccount.php:
 $wgFileStore['accountreqs']['directory'] = $wgConfirmAccountPathAR;
 $wgFileStore['accountcreds']['directory'] = $wgConfirmAccountPathAC;
* LocalSettings.php:
 $wgEnableEmail = true;
 $wgEmergencyContact = "mcnair@rice.edu";
 $wgPasswordSender = "mcnair@rice.edu";
 # User Account Confirmation
 require_once "$IP/extensions/ConfirmAccount/ConfirmAccount.php";
 $wgSMTP = array(
     'host' => 'ssl://smtp.mail.rice.edu',
     'IDHost' => '128.42.44.22',
     'port' => 465,
     'username' => 'mcnair@rice.edu',
     'password' => '*********',
     'auth' => true
 );
 $wgConfirmAccountContact = 'mcnair@rice.edu';
==== Updating the Wiki ====
 cd /var/lib/mediawiki/maintenance
 php update.php
== Mediawiki extensions ==
=== Semantic Mediawiki Extensions ===
The SMW extension installation process requires composer.phar to be installed. All further SMW additions are installed through composer.phar.
==== Installing Mediawiki Composer.phar ====
Here is the mediawiki link: [2]
==== Installing Extension: Semantic Result Formats ====
* Here is the link to the installation process:
* Here is the command to be run in the Mediawiki root folder (/var/lib/mediawiki):
 php composer.phar require --update-no-dev mediawiki/semantic-result-formats "2.*"
== Using Google Analytics ==
Go to: https://analytics.google.com/
Log in with admin@mcnaircenter.org
To do:
* Check base analytics configuration
* Check integration with Google Search
* Install appropriate APIs