A server I administer running the Piwik web analytics platform encountered a problem during an automatic upgrade from Piwik 2.11.0 to 2.11.1.

The error given after clicking the automatic upgrade link was similar to:

curl_exec: Operation timed out after 9984 milliseconds with 12801064 out of 13344050 bytes received.

Hostname requested was: builds.piwik.org

Close, but no cigar. Curl tries to download the 13 MB Piwik upgrade archive and nearly succeeds, but the default curl timeout stops it just short. On servers with plenty of bandwidth this is probably never a problem, since it’s all about getting the file within the default time limit. With insufficient bandwidth, the timeout is reached before the server can finish downloading.
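As a rough sketch of the arithmetic, using the byte count from the error message above and assuming the timeout fired at roughly the ten-second mark:

```shell
# Rough minimum sustained bandwidth needed to fetch the whole archive
# before the timeout (byte count from the error message above; the
# ten-second figure is approximate).
bytes=13344050
seconds=10
awk -v b="$bytes" -v t="$seconds" \
    'BEGIN { printf "%.1f Mbit/s required\n", b * 8 / t / 1000000 }'
```

In other words, anything much below about 11 Mbit/s of sustained throughput to builds.piwik.org would hit the timeout before the download completed.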

This issue is fixed by a longer timeout in the latest version. However, if you’re on version 2.11.0 like I was, that doesn’t fix the immediate problem of hitting the curl_exec operation timeout when you try to upgrade automatically to the latest version.

There is a workaround, which involves editing one line of one file in your current Piwik install. Remember to back up both your Piwik installation files and your Piwik database before fiddling with the installed files. Misadventure could lead to a broken installation or, worse, loss of your analytics data!

Piwik’s fix is in /piwik/plugins/CoreUpdater/Controller.php (see https://github.com/piwik/piwik/commit/76e4378b18b2e24853846c38f7eded956ee0eb57), which shows the timeout (in seconds) being raised from 30 to 120 ahead of the 2.11.2 release. I, however, was stuck on version 2.11.0, where line 175 read:

Http::fetchRemoteFile($url, $this->pathPiwikZip);

I therefore changed this line to read:

Http::fetchRemoteFile($url, $this->pathPiwikZip, 0, 120);
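If you prefer, the same one-line change can be scripted from a shell in the Piwik root. This is just a sketch, assuming the stock 2.11.0 line shown above is present verbatim, and it keeps a backup copy first:

```shell
# Back up the file, then bump the fetch timeout to 120 seconds in place.
cp plugins/CoreUpdater/Controller.php plugins/CoreUpdater/Controller.php.bak
sed -i 's/Http::fetchRemoteFile($url, $this->pathPiwikZip);/Http::fetchRemoteFile($url, $this->pathPiwikZip, 0, 120);/' \
    plugins/CoreUpdater/Controller.php
```

Either way, the result is the same edited line; check it with a quick grep before re-running the updater.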

I then saved and re-uploaded the file to the server and re-ran the automatic updater, which took me straight to the then-current 2.11.2 version.

Job done :-)


I look after a server which is connected to some temperature sensors on a remote site. The site is connected to the internet via a RADSL link at the end of 1.5 km of very old copper cable. The connection’s reliability is weak at best, and the dynamically assigned IP address tends to change every time the connection drops, which can be several times a day.

I recently updated the server from Ubuntu 13.10 to Ubuntu 14.04 Long Term Support which, as it is supported until 2019, should see out the antiquated hardware’s days.

In order to communicate with the server remotely, I need to know its current (dynamically assigned) public IP address, so I run a dynamic DNS client on the server to update a remote service. The service I chose for this purpose is No-IP. They offer a free basic DDNS account which meets my needs.

On the client side, I have previously (some years ago) used ddclient to push the IP address to No-IP, but have more recently struggled to get its configuration working reliably.

Instead, I now use the NoIP2 client in the following fashion:

First of all, working locally on the server (because that ADSL connection really is that bad!) I download the NoIP2 client:

wget http://www.noip.com/client/linux/noip-duc-linux.tar.gz

Then I extract the archive and cd to the resulting directory:

tar -xvf noip-duc-linux.tar.gz
cd noip-2.1.9-1

The directory may differ if the noip2 client version number is updated.

Now I install the client using:

sudo make install

Next, the client needs to be configured. You’ll need the username and password you selected when you signed up for your No-IP DDNS user account.

sudo /usr/local/bin/noip2 -C

The -C flag tells the installed client to create a new configuration where you enter:

  • Your username (email address for No-IP account)
  • Your password (the one for the above)
  • The interval (in minutes) that you wish to update your public IP address to No-IP
  • A command to execute when the update is successful (you can leave this blank)

At this point, you have a working client that will update your public IP address to No-IP. Note that you may need to open port 8245 (outbound) on your router if you have a restrictive outbound ports setup.

However, if the power on the remote site goes down (and it often does, especially during storms), the server will reboot, but there is nothing to tell it to restart the noip2 client.

To remedy this, I create a new file in init.d using the nano text editor:

sudo nano /etc/init.d/noip

I add the following to the file (referenced in the README):

#!/bin/sh
# Minimal start/stop script for the noip2 dynamic DNS client.
case "$1" in
    start)
        echo "Starting noip2"
        /usr/local/bin/noip2
    ;;
    stop)
        echo "Shutting down noip2"
        # noip2 -S lists running instances; extract each PID and kill it.
        for i in $(noip2 -S 2>&1 | grep Process | awk '{print $2}' | tr -d ',')
        do
            noip2 -K "$i"
        done
    ;;
    *)
        echo "Usage: $0 {start|stop}"
        exit 1
esac
exit 0
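For reference, the stop branch works by scraping the client’s own status report. Here is what that pipeline does to a sample status line (the exact wording is an assumption based on the 2.1.9 client; other versions may word it differently):

```shell
# Sample of the "noip2 -S" status output the stop branch parses; the
# pipeline pulls out the process ID so it can be passed to "noip2 -K".
status='Process 1234, started as noip2, (version 2.1.9)'
echo "$status" | grep Process | awk '{print $2}' | tr -d ','
```

This prints 1234, i.e. the second whitespace-separated field with its trailing comma stripped.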

Then Ctrl-x to exit nano and y to save the file.

I need to make the script executable so I issue:

sudo chmod 755 /etc/init.d/noip

To tell the system to run the script after booting, I open up and edit rc.local:

sudo nano /etc/rc.local

Find the line that contains exit 0 and, on the line above, add:

/etc/init.d/noip start

Again, Ctrl-x to exit nano and y to save.

Now when you reboot your server, the noip2 service should be running.

You can check whether it is running by issuing:

ps aux | grep noip2

You should see the noip2 service listed in the result.
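One caveat: a plain grep will also match its own grep process in the listing. A slightly more robust sketch of the same check uses pgrep:

```shell
# pgrep -x matches the exact process name and never matches itself.
if pgrep -x noip2 >/dev/null; then
    echo "noip2 is running"
else
    echo "noip2 is NOT running"
fi
```

Either approach is fine; pgrep just saves you squinting past the grep line itself.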

After the update interval that you set when you configured the No-IP client, you should be able to reach services on your server via the No-IP dynamic hostname you were assigned, i.e. your-noip-name.no-ip.com. If you’re unsure of this name, you can check it by logging into your No-IP account and looking it up. Don’t forget, you may need to open the relevant ports on your router to match the services you’re trying to reach on your server.

This should provide a relatively simple and free alternative to paying for a fixed IP address for your home or remote Ubuntu 14.04 LTS servers. Remember that in order to keep your No-IP hostname, you need to update your IP at least every thirty days, or the hostname is released for re-use. Just keep the noip2 client going at a regular update interval and this should not be a problem.


I’ve been using an Oculus Rift DK2 on an AMD laptop with dual graphics and have encountered the following error when using OVR runtime 0.43 or later.

There is a problem with your Rift configuration.
Interfering software is preventing the Rift from activating. Loaded UMD is reported as ‘aticfx32.dll’.

This causes whichever direct-to-Rift demo is in use to start, but with no output to the Rift.

The laptop has “switchable graphics”. Essentially this is a Radeon 8650M for low power tasks and an 8970M/M290X for gaming and high-performance tasks.

If the software used with the Oculus Rift is set to run on the 8650M [power saving] in Catalyst Control Center’s Switchable Graphics Application Settings, it will work correctly with the Rift, albeit at greatly reduced performance. If it is set to use the 8970M/M290X [high performance], the black screen and “aticfx32.dll” error appear.

It seems that the OVR runtime does not recognise the switch from the low-power to the high-performance card as the application starts.

The only workaround I have so far found is to downgrade the OVR runtime to 0.42.

It is likely that the bug will be eliminated in future versions of the OVR runtime. In the meantime, I intend to experiment with different AMD Catalyst driver versions to see whether that makes any difference. I am currently using AMD Catalyst version 14.12.

If you’ve had success with another Catalyst version and dual graphics on an OVR runtime later than 0.42, please let me know in the comments section.


Windows 8 was a cheap upgrade path for me, as the software I wanted to run was no longer supported on XP 64. That software, of course, was the sole reason for running Windows: video games.

If you’ve tried Windows 8, you’ll probably have had the experience of booting up your desktop PC to what appears to be a tablet or smartphone OS that does nothing but get in the way of the normal desktop tasks you are used to. This ‘start page’, part of the Windows 8 Metro interface, would be great if you were looking at a 10″ portable touch-screen, but the fact of the matter is, no matter how hard you prod your 23″ TFT, all you get are pressure splotches and the keyboard and mouse get lonely.

The term 'Metro' can bring back painful memories for some users.


Fear not for help is at hand!

Help comes in the form of an open-source software project known as Classic Shell. This software allows you to return a Start Menu to its rightful place at the lower left of the taskbar and to access your applications (I’ll choke if I now have to call them ‘apps’) in the manner to which you are accustomed.

Classic Shell also allows you to add back some popular features, such as a full path and status bar in Windows Explorer and normal navigation in IE9.

Classic Shell Start Menu on Windows 8

Thankfully, I only really need Windows 8 as an engine for running games, so I am spared the painful task of re-learning my workflow from scratch. I dual-boot to Linux for serious tasks and, with Gabe Newell’s pro-Linux stance for Valve and its Steam platform, the need for Windows will hopefully diminish in future.


If you’ve come across QR Codes before (the square 2D barcodes designed for capture by smartphones), you’ll know they can hold a variety of information.

I recently wanted to offer a map location via a QR Code and, handily, the ZXing (‘Zebra Crossing’) barcode reader app for Android supports this via the geo: format.

The data format is as follows:

geo:[DecimalLatitude],[DecimalLongitude],[MetricAltitude]

The altitude element (and the comma preceding it) is optional, and I’m not quite sure of its value unless some applications let you find a room in a tall building this way.
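For example, here are the two forms side by side, using the Paris co-ordinates from below and a made-up altitude of 35 metres:

```shell
# The optional altitude (in metres) simply follows a third comma.
printf 'geo:%s,%s\n'    48.856614 2.352222
printf 'geo:%s,%s,%s\n' 48.856614 2.352222 35
```

This prints geo:48.856614,2.352222 and geo:48.856614,2.352222,35 respectively; either string can be fed straight to a QR Code generator.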

So, if I wanted to give you directions to Paris, France with no altitude information, I would use the following co-ordinates:

Lat: 48.856614 Lon: 2.352222

To pass these to a QR Code generator, I use qrencode for Linux, which lets you generate QR Codes from the command line.

The following command produces a png image called paris.png of a QR Code which contains the co-ordinates for Paris.

qrencode -o paris.png geo:48.856614,2.352222

When scanned using the Android barcode app (the common ZXing one), it offers two immediate options. You can “Show Map”, which fires up Google Maps and shows the location, or you can “Get Directions”. Depending on the apps installed, “Get Directions” may offer the choice of getting directions either by Google Maps or via the web browser.

Obviously, this is only an example of how it works for Android users. The geo: tag may work on other handset types, barcode reader apps and operating systems, but I have not tested this.

I would be interested to hear from anyone who has tried this using iOS etc.

 


For those in higher latitudes, it will be getting colder now as Winter approaches, and it’s worth beginning to think about protecting your electronic devices from the cold.

In the Brave New World of the 21st Century, we have many devices which are easily portable, either in your pocket or in your car. Many of these devices, however, are not hardened against the cold and the problems it brings.

So, I have a short piece of simple advice for this Winter.

If you have taken a powered-down device out in the cold or left it in a cold place such as your car for a period of time, do not immediately power it on.

It is quite possible that, as the cold device is moved into a warmer area, moisture will condense onto electronic components, creating a potential short circuit waiting to happen.

So, quite simply, allow time for the device to warm back up to room temperature before switching it on. This may take several hours depending on humidity, but it will allow the condensation to evaporate. It is worth the wait to protect your devices.


All methods and procedures shown here are carried out entirely at the user's own risk. Please read our disclaimer