The Modern Yahoo Pipes Alternative … Azure Logic Apps!

Back in 2015, Yahoo! shut down a service I used to love called Yahoo Pipes. It allowed you to combine and filter XML-based RSS feeds to create a new RSS feed containing the data you wanted.

At the time, I looked for a decent alternative and didn’t find anything.. until recently, when I realised that Logic Apps (which I’d been creating for over a year at work) were actually comparable to a next-gen version of Yahoo Pipes, with A LOT more functionality.

Since Logic Apps / Power Automate (which used to be called Flow) are a lot more feature-rich, it's a little harder at first to do what you want with the data, but once you've developed a few, you'll see how much more flexible they are, especially now the world has moved to JSON-based APIs, with XML-based RSS taking a back seat.

While Pipes would just output RSS, Logic Apps can output the result into pretty much anything you want.. JSON, email, other APIs, Twitter, spreadsheets (Office 365 and Google Sheets).. the list goes on and on!

You can get started with Logic Apps for free.. and they’re very low cost to run beyond the Azure trial period. You’re talking pennies to run them monthly, depending on run-frequency & complexity. The pricing calculator can help you if you’re worried about cost. If you created 10 Logic Apps, each containing 10 basic actions and 10 standard actions, and ran them all once a day for 30 days it’d cost you 34p a month;

Actions in Logic Apps are the blocks you see in a flow;

It starts with a ‘Trigger’.. so in this case I’ve chosen a Recurrence (schedule), but you could set one up to trigger on an HTTP call from an RSS reader app, something happening on Twitter, or pretty much anything.. here are a few of the triggers to choose from…

Just some of the Trigger actions available!

Non-Microsoft products are well represented, so don’t go thinking you’re limited to SharePoint or Office 365; that’s not the case.

You then design the flow how you want it to behave, e.g. get data from X and Y, merge it together, filter it, query an API, persist some data in a sheet, then send back a piece of JSON.

Choose from Built-In, Standard or Enterprise actions.. you can even build your own custom connectors for systems that aren’t currently listed.

List of the Built-In actions (April 2020)

As you’ll see in the video I put together, to replace Yahoo Pipes you need to be able to merge, filter, manipulate, and sort JSON. You can do a lot of this with the built-in actions, but you can use Inline Code to achieve anything else.

Inline Code runs Node.js and has access to the standard built-in JavaScript objects.

The one thing Logic Apps aren’t great at compared to Yahoo Pipes is sharing what you’ve created.. with Pipes it was really easy to adapt something someone else had made to suit your needs. Logic Apps don’t work like that at all, and you’ll need to roll your own, which takes some trial-and-error & patience.

Basic filtering (etc) is quite straightforward …

… but Inline Code might take you longer to figure out, so to get you started, here are some Inline Code snippets you might find useful;

Simple Text Filter
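To give you a flavour, here’s a minimal sketch of a text filter.. the earlier action name (‘Parse_JSON’) and the item fields are assumptions, so swap in your own (and remember to add the referenced action under the Inline Code action’s settings);

// Keep only the items whose title mentions a keyword
var items = workflowContext.actions.Parse_JSON.outputs.body;   // output of an assumed earlier action
var filtered = items.filter(function (item) {
  return item.title.toLowerCase().indexOf('azure') !== -1;
});
return filtered;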

Multi-Filter Example Including Regex
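Extending that idea to several filters at once, including a regular expression (again just a sketch, with assumed field names);

// Keep items that match any one of a list of patterns
var items = workflowContext.actions.Parse_JSON.outputs.body;   // assumed earlier action
var patterns = [/logic\s*apps/i, /yahoo pipes/i, /\brss\b/i];
var filtered = items.filter(function (item) {
  var text = (item.title || '') + ' ' + (item.summary || '');
  return patterns.some(function (pattern) { return pattern.test(text); });
});
return filtered;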

Generate HTML for an Email
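And a sketch of turning the filtered items into an HTML body you can hand to a ‘Send an email’ action (the action and field names are illustrative);

// Build a simple HTML list from the filtered items
var items = workflowContext.actions.Filter_items.outputs.body;   // assumed earlier action
var rows = items.map(function (item) {
  return '<li><a href="' + item.link + '">' + item.title + '</a></li>';
});
return '<ul>' + rows.join('') + '</ul>';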

This hasn’t been a step-by-step guide, but has hopefully shown you enough to spark your interest.. get yourself a free trial of Azure, and give it a go!


How to list Azure Resources that don’t have Alerts enabled

Alerts are great for letting you know when Logic Apps or Function Apps fail to run for any reason.. the problem is, while you can list out which resources have Alerts, you can’t get a list of resources which don’t have Alerts!

So how do you find out which resources you need to add Alerts to? Well, the CLI has the tools you need, but it’s a bit more complicated than it should be!

First you need a list of resources which have alerts.. you can do that with a command like this;

az resource list --output tsv --resource-type "Microsoft.Insights/metricAlerts" --query [].id

You can run that as part of this command, which is supposed to take the Alert ID list, feed it into az resource show, and output the value in properties.scopes[0], i.e. the ID of the resource the alert is set up to monitor.

az resource show --ids $(az resource list --output tsv --resource-type "Microsoft.Insights/metricAlerts" --query [].id) --query properties.scopes[0] --output tsv

However, this fails miserably when the Alert name has a space in it; there ends up being a space in the Alert ID, which makes our command throw an error. UPDATE – Microsoft are figuring out whether they can fix it in this bug report I raised.

What I ended up having to do is break up the commands inside a Bash script & use a loop.
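Something along these lines does the job (a sketch re-using the same az query from above; the variable names are illustrative);

#!/bin/bash

# Grab all the Alert IDs and mask any spaces so the loop doesn't split an ID in two
ALERT_IDS=$(az resource list --output tsv --resource-type "Microsoft.Insights/metricAlerts" --query [].id | sed 's/ /SPACEFIX/g')

for ID in $ALERT_IDS; do
  # Put the spaces back before querying the individual Alert
  REAL_ID=$(echo "$ID" | sed 's/SPACEFIX/ /g')
  az resource show --ids "$REAL_ID" --query properties.scopes[0] --output tsv
done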

It assigns all the Alert IDs to a variable, replacing spaces with ‘SPACEFIX’, then loops over the Alert IDs, querying each Alert to find which resource it’s monitoring; ‘SPACEFIX’ is swapped back to spaces inside the loop. It’s somewhat of a hack to get round this silly issue!

There might be a better way to handle this, but I didn’t find it in a hurry.

Now that we’ve got a list of resources that have alerting enabled, we need to list out all the resources we want to check.. for this example I’m just interested in Logic Apps, and this script will list them out, one resource per line;
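Something like this (Microsoft.Logic/workflows being the resource type for Logic Apps);

# List every Logic App in the subscription, one resource ID per line
az resource list --output tsv --resource-type "Microsoft.Logic/workflows" --query [].id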

Now we can redirect the output of the 2 scripts into temporary files, and use the ‘comm’ command to show us the lines that exist in one file that don’t in the other.. here’s how it looks;
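For example (the script names are placeholders for wherever you saved the two snippets above.. note that comm needs its input sorted);

# Resources that alerts point at, and all Logic Apps, each sorted into a temp file
./list-alerted-resources.sh | sort > /tmp/alerted.txt
./list-logic-apps.sh | sort > /tmp/logic-apps.txt

# Show only the lines unique to the second file, i.e. Logic Apps with no alert
comm -13 /tmp/alerted.txt /tmp/logic-apps.txt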

This gets us what we want.. a list of Logic Apps that don’t have Alerts set up for them. You can pipe the results through grep to check certain Resource Groups, or whatever you need.

Hope this helps save you some time, and gives you a few ideas for useful scripts.

I’ve published some videos on YouTube showing various Azure tips, which you might want to check out!


Sync Photos from a Camera SD Card to your Phone

As I get used to owning a Fujifilm X-E3 compact mirrorless APS-C camera, one thing I’ve just figured out is how I can transfer photos and videos off it to my Android phone.

The Fuji app is supposed to do a lot of this automatically, but I’ve found it unreliable (often not connecting to the camera), doesn’t sync video, and needs to use its own WiFi hotspot for the transfer.

What I’ve figured out is I can use a USB on-the-go (OTG) adaptor, USB SD card reader, and a couple of apps to do everything I need. Here’s how you can set up the same workflow:

Get a USB OTG adaptor and USB SD Card Reader. Make sure your phone can read the SD card when it’s attached.

You’ll need to use a file-browsing app to look for the attached USB device; phone manufacturers usually pre-install one for you.

If you can see the files on your SD card, you can move onto the next step!

Next download FolderSync. This will allow you to sync the files from your SD card to your phone’s internal memory. Set up a ‘folder pair’ to sync the files where you want them (even to the Cloud).

Now you’ve got the files syncing from your SD card to your phone, you may need to tweak things depending on the camera you have. Fujifilm cameras save videos as .MOV files, which don’t show up in Google Photos or the gallery app (not on Samsung phones, anyway).

To fix this, I’ve used an app called Automate to rename the copied files to .MOV.mp4. That (weirdly) sorts it out!

If you haven’t seen Automate before, it’s a bit like Tasker, but you build the automation (flow) using blocks.

The MOV renamer process I built looks like this;

This can be added to my homescreen as a shortcut which runs the flow when I tap it.

I’ve published the final version of the flow to the Automate community so that you can download and adapt it however you see fit.

To help show you exactly how this works I’ve created a short video. I go into more detail on the flow I used, and you can see the whole thing in action.

How to Remove Adverts from your new Samsung Smart TV

When I bought the Samsung Q9FN flagship TV the last thing I expected was for there to be adverts built into the menu bar, and for movie trailers to start playing when I powered on the TV each time!

For the adverts in the menu bar, take a look in Settings & check the Policy Agreements. Be sure to un-tick any of the options that agree to advertising or tracking. Once you’ve made those changes, the ads should disappear.

The film trailers are coming from built-in apps like TV Plus & Rakuten. It’s not that easy to figure out how to get rid of them; you certainly can’t uninstall the apps.. Samsung don’t let you do that. They’re like the apps that are pre-installed on smartphones.. most of those you can’t remove either.

There’s apparently a maintenance menu that allows you to disable TV Plus, but the jury is out on whether accessing this service menu voids your warranty. There’s an easier way, which I demonstrate in this video, which definitely doesn’t affect your warranty.

Autopilot for Cosmos db – The Cost of Convenience

In the October 2019 update of Azure, Microsoft added ‘Autopilot’, which automatically scales the throughput of a Cosmos DB container. This is handy for unpredictable workloads.. like irregular imports, where you’d otherwise hit the 400 RU cap you’ve provisioned and have a Data Factory pipeline cut out part way through.

This can’t be retroactively enabled on existing Cosmos DB containers.. only new ones.

We compared the cost against a database with a manual setting of 400 RUs and ran them both for a couple of days with no usage.

This is how it looked in Cost Analysis:

Throughput                  Daily Cost   Yearly Cost
400 RUs                     $0.75        $273
600 RUs                     $1.15        $419
Autopilot (4,000 RU max)    $1.13        $412

As you can see, the standing charge is higher for Autopilot… about $138/yr more than a manual 400 RUs, and roughly equivalent to running at 600 RUs.

If you have a 400 RU container with predictable high-throughput bursts you can run a script to temporarily increase the RUs, then set them back when you’re done.. that’ll save you money, especially if you have a lot of similarly configured containers.
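With the Azure CLI, something like this should do it for a SQL-API container (the resource names are placeholders, and the exact command group may vary with your CLI version);

# Burst the container up before a heavy import...
az cosmosdb sql container throughput update -g my-rg -a my-account -d my-db -n my-container --throughput 4000

# ...and drop it back down afterwards
az cosmosdb sql container throughput update -g my-rg -a my-account -d my-db -n my-container --throughput 400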

How to Automate PageSpeed Insights for Multiple URLs on a Schedule using Logic Apps or Flow

For the website I’m responsible for, I was interested in capturing the data from the Google PageSpeed Insights tool, and having the data recorded somewhere on a schedule. There’s a blog post on Moz.com that talked about doing this with a Google Sheet, but it wasn’t quite what I was after; I wanted the data to be collected more regularly.

Instead of using Google Sheets (and a fair amount of code), I decided to use an Azure Logic App (you can use this or Microsoft Flow), which is part of Microsoft’s Cloud platform.

The Logic App is run on a Recurrence trigger which I set to every 6 hours. By collecting the results automatically over time, you’ll see how the changes you’re making to your site affect your PageSpeed scores.

(Screenshot: the Recurrence trigger, set to run every 6 hours)

The first step simply defines the URLs you want to check, then it’ll loop over each one & call the PageSpeed API. Go get an API key, and make sure PageSpeed API is enabled.
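Each check then boils down to an HTTP GET against the PageSpeed API, something like this (YOUR_API_KEY is a placeholder, and strategy is optional);

https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=https://www.example.com/&strategy=mobile&key=YOUR_API_KEY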

Results from the API call are parsed out and pushed into a new row in an Excel Online sheet.

If you’re interested in setting this up yourself, I recorded a short video which shows how it works in more detail.

There are a few foibles in Logic Apps which caught me out. First, getting the list of URLs into an Array didn’t work as expected; I had to switch to Code View to correct the escaping of the return character, so that the expression reads;

@split(variables('urlList'), '\n')

The JSON payload from the PageSpeed API is pretty large, so I’ve listed the path to the elements you’ll be interested in below. I’m using split (on space) purely to get at the numerical value, which is more useful in the spreadsheet;

First Contentful Paint

@{split(body('HTTP')?['lighthouseResult']?['audits']?['first-contentful-paint']?['displayValue'], ' ')[0]}

First Meaningful Paint

@{split(body('HTTP')?['lighthouseResult']?['audits']?['first-meaningful-paint']?['displayValue'], ' ')[0]}

Speed Index

@{split(body('HTTP')?['lighthouseResult']?['audits']?['speed-index']?['displayValue'], ' ')[0]}

Time To Interactive

@{split(body('HTTP')['lighthouseResult']['audits']['interactive']['displayValue'], ' ')[0]}

Time to First Byte

@{split(body('HTTP')?['lighthouseResult']?['audits']?['time-to-first-byte']?['displayValue'], ' ')[3]}

Overall, this was quite easy to put together and shows the power of Azure Logic Apps. Being able to do something like this without any code or (your own) servers, and get it live in a couple of hours, makes it a fantastic tool to have at your disposal.

Make your own £5 ambient TV backlight

After clearing out some junk, which included an old halogen desk lamp, I was thinking about putting in an LED light behind the PC monitor.

Then I remembered I’d bought a ring of 24 RGB LEDs from Aliexpress last year & hadn’t used it in a project.

I also had a spare Arduino Nano, and all the things I’d need to allow me to hook up a dial (potentiometer) for the light level, and button to cycle through different colour modes.

Here’s a quick video of it in action..

Parts

LED Ring – £2
Arduino Nano – £1.75
Breadboard & bits – £1.25

Wiring it up

It’s an easy one to wire up.. I took a few basic examples and mashed them together to get what I wanted from the design.

I’m not an electronics expert, and approached this like I approach software development; write it in manageable/testable chunks, which I can implement and test individually, then bolt it all together.

(Fritzing wiring diagram: fritz-led-ring)

The code for the project is pretty simple.. I think the most complicated bit is handling the button, which needed debounce functionality.
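If you want a starting point, here’s a bare-bones debounce sketch.. the pin number, timing and number of colour modes are assumptions, and the LED-driving code is left out;

// Debounce a push button and cycle a colour mode on each press
const int BUTTON_PIN = 2;                 // assumed pin, button wired to ground
const unsigned long DEBOUNCE_MS = 50;     // ignore changes shorter than this

int lastReading = HIGH;                   // last raw reading from the pin
int stableState = HIGH;                   // debounced button state
unsigned long lastChangeTime = 0;         // when the raw reading last changed
int colourMode = 0;                       // which colour mode the LED ring is in

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);
}

void loop() {
  int reading = digitalRead(BUTTON_PIN);
  if (reading != lastReading) {
    lastChangeTime = millis();            // reading changed, restart the debounce timer
  }
  if ((millis() - lastChangeTime) > DEBOUNCE_MS && reading != stableState) {
    stableState = reading;
    if (stableState == LOW) {             // button pressed
      colourMode = (colourMode + 1) % 4;  // cycle through the colour modes
    }
  }
  lastReading = reading;
  // ...read the potentiometer and update the LED ring here...
}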

And here are a few pictures of it in place behind the PC monitor..

Samsung Q9FN Tips, Tricks, Secrets & Problems

The 2018 flagship TV from Samsung is the Q9FN (Amazon link: https://amzn.to/2IvwmkO). This is a FALD (Full Array Local Dimming) display with 480 LEDs lighting the display, rather than being edge-lit like a lot of other models. This generally means it’s much more capable of providing good contrast ratios. Overall I think it’s a good TV … most of the time….

Problems

However, the reality is different from the headlines and reviews in the major publications. What a lot of people have found is that the Contrast Enhancement & Local Dimming features can cause light fluctuation problems, which are especially apparent in dark scenes with subtitles.

Light Fluctuations

You can see for yourself in this clip from Narcos Mexico S01E05 at about 45 minutes. This is being viewed via the built-in Netflix player in HDR, with Contrast Enhancement turned on, since without it enabled, dark scenes are waaay too dark to see anything!

It shows how subtitles affect the light levels in other areas of the screen.. like right at the top, nowhere near the subtitles.

Backlight Flicker

This shows how I’m seeing a flicker in certain scenes. It’s like the TV can’t quite decide on the light level it’s supposed to display, then clicks into place. I’ve run this at standard speed, then slowed it right down to illustrate the flicker. You’ll need to look closely at the background & look for the light fluctuation.

FALD Confusion!

There are also instances when FALD gets in the way of drawing a background with a solid colour. This video shows a short excerpt from the film Searching (2018).. at about 52 min 30 sec. The FALD back light system has real trouble working out what to do with the dark blue satnav background which should be a solid/uniform colour.. but the bright white roads cause it a lot of problems. This could be a disadvantage of FALD over edge-lit or OLED. At least that’s the way it seems. I’m not entirely sure whether you could even solve this in software.

The blue lights/dots on the left are a reflection of the Xmas tree lights.. so nothing to do with the TV ;)

Tips & Tricks

Okay, enough with the problems, and onto the tips!

Secret Buttons!

At first glance the TV has no physical buttons to control it.. so if you’ve misplaced the remote, it looks like there’s nothing you can do. However, take a look under the frame near the logo and there’s a neat directional control + OK button.

Steam Link for Free!

Instead of buying a Steam Link device, there’s actually a free app that lets you stream games from your Gaming PC to your TV. Install the app & plug your controller into the TV and you’re pretty much good to go!

Removing Adverts(!!) from the Menus

Yes, adverts.. in the menus.. on a brand new TV that you paid a lot of money for!

This video shows how I’ve been able to get rid of the movie trailers which annoyingly play so easily. They’re coming from apps like TV Plus & Rakuten. It’s not that easy to figure out how to get rid of them.. you certainly can’t uninstall the apps.. Samsung don’t let you do that :(

I found ads in the menu too.. if you go back into the policy agreements and make sure you’ve not ticked any of the options to agree to them, the ads should go away.

Here’s a link to the Q9FN on Amazon; https://amzn.to/2IvwmkO

Installing SABnzbd on a Raspberry Pi running OSMC

For quite some time I’ve been running SABnzbd on a PC, downloading files, and then transferring them over the local network to a USB drive attached to a Raspberry Pi which is running OSMC. There’s a Linux version of SABnzbd which means I can cut out the PC and have the Pi handle the downloads. It’ll mean I can queue up the downloads from a web interface running on whatever device I have to hand, like an iPad.

First Try

The initial installation of SABnzbd was quite easy;

sudo apt-get install python-openssl unrar par2

sudo apt-get install sabnzbdplus

Edit the settings so that the web client starts up on port 8085..

sudo nano /etc/default/sabnzbdplus

USER=osmc
HOST=0.0.0.0
PORT=8085

sudo service sabnzbdplus restart

This then allowed me to connect to SABnzbd and transfer over all my settings that I was using on my PC.

Delayed Start

What I found was that SABnzbd started before the USB drive was properly mounted by OSMC, so I disabled the main service from starting up, and added a script to wait for the USB drive to get mounted at a particular path.

There was a good forum post here that pointed me in the right direction.

Disable the default service…

sudo update-rc.d sabnzbdplus disable

Write a quick shell script to wait for the directory/USB drive to be mounted…

nano /home/osmc/startsabnzb.sh

#!/bin/sh

# Wait for this folder to be mounted...
DIR=/media/Elements

while [ ! -d "$DIR" ]; do
  sleep 120
done

/etc/init.d/sabnzbdplus start

chmod a+x /home/osmc/startsabnzb.sh

Add the script to system startup…

sudo nano /etc/rc.local

/home/osmc/startsabnzb.sh

Upgrading

The version of SABnzbd installed above was very dated; that repo doesn’t get updated very often. Here’s how I updated it to the latest version.

sudo su root

echo "deb http://ppa.launchpad.net/jcfp/nobetas/ubuntu xenial main" | tee -a /etc/apt/sources.list
echo "deb http://ppa.launchpad.net/jcfp/sab-addons/ubuntu xenial main" | tee -a /etc/apt/sources.list

apt-key adv --keyserver hkp://pool.sks-keyservers.net:11371 --recv-keys 0x98703123E0F52B2BE16D586EF13930B14BB9F05F

sudo apt-get update

Upgrading sabyenc

This solved the issue where SABnzbd was complaining that sabyenc wasn’t the right version. It uses the 2nd repo (sab-addons) we added in the steps above.

sudo apt-get install python-sabyenc

Final thoughts

SABnzbd runs quite well on the Pi. It is a lot slower than it was on a PC.. it only manages about 3 MB/s on the download on a wired connection (compared to 6 MB/s on a Wifi connection on a laptop), and unpacking is slow.

However, the files are unpacked onto the device which I was manually copying the files to anyway, so that saves time.

Samsung Galaxy S9 Super Slow-Mo Videos

The Samsung Galaxy S9 is a bit of a bargain if you look at the International versions. When it retailed for about £700 in the UK, you could import it from Italy for about £500! It’s available now on Amazon for £480.

When I bought the S9 I didn’t realise it had a special video mode called Super Slow-Mo. I’ve had slow mo on a Canon Ixus camera before, but it was poor resolution and needed a lot of light to work.

The S9 shoots super slow-mo at 960 frames per second, and captures 0.2 seconds of motion, stretching it to 6 seconds when you play it back.

One big difference with the S9 is how you trigger super slow-mo. When it’s set to automatic, a yellow square appears over the video preview.. when you start recording, the S9 looks for motion within the square, and when it sees movement it’ll trigger super slow-mo.

For me, the automatic mode makes it more than a gimmick .. you can get some great videos out of it without much effort. Here are a few (non-professional) examples that I’ve taken over the past few months;

(Embedded videos: a few example super slow-mo clips)

These were quite simple to capture.. just practice a bit to see how it triggers and you can get good results.

Sharing attached USB storage in OSMC using NFS

As well as being attached to the living room TV for use as a media centre, I also wanted to be able to use my Raspberry Pi 3 B+ as a simple NAS for other TVs in the house to stream from.

The Pi I’m using has a 1Tb desktop hard drive attached to it over USB, and I wanted a way to easily share the contents. It was actually relatively easy to set up… this is how to do it in OSMC;

  1. Install SSH to OSMC via the Store
  2. Now you can remote shell into the Pi to set up the network share
  3. Install NFS services using the following command;
    sudo apt-get install nfs-kernel-server
  4. Edit the file shares;
    sudo nano /etc/exports

    Add a share like this;

    /media 192.168.1.0/255.255.255.0(rw,fsid=0,insecure,no_subtree_check,async,crossmnt)

    (crossmnt fixed an issue where I could see the folders but no files)

  5. Restart the NFS service;
    sudo /etc/init.d/nfs-kernel-server restart

That’s it.. you should now be able to connect to the Raspberry Pi and see the files on any of the USB drives you’ve got attached.

Automated mains socket power-off for OSMC on a Raspberry Pi

I’ve chosen to replace an ageing mini-PC, which I’d used since 2010, with a new Raspberry Pi 3 B+ running OSMC. It makes for a really capable media centre which can play back newer h.265 HEVC video files at 1080p without any problems, or serve 4K files over NFS to a box with a hardware h.265 chip, like the Fire TV Box.

This form factor is easy to take on holiday and you can use an old infrared remote control (or Harmony learning remote) with it too.

However, the one thing I’ve struggled with is how to make it easy for my family to use in regard to switching it on and off. The Pi doesn’t have a power button. Some power supplies have an inline rocker switch, which almost fits the bill. I wanted something more automated.

Fortunately I had a spare Energenie power socket from a previous project where I use one to turn off our bass speaker when the TV isn’t on. These power sockets are controlled remotely (over RF) from a Pi which you attach an Energenie control board/shield to.

What I’ve done with the Pi 3 is have it powered through an Energenie socket, and set up a service that executes when it detects OSMC is shutting down. That service will make a quick HTTP call to the Pi with the Energenie controller shield, which will in turn send an RF signal to turn the mains socket off.

 

(Diagram: osmcshutdown – the OSMC Pi calling the Energenie-equipped Pi as it shuts down)

 

Here’s how you can set it up like I have…

Scripts for the Pi running OSMC

First, add a new service script.. create a new file at this path;

/etc/systemd/system/callenergenie.service

Then enable it with;

sudo systemctl enable callenergenie.service

There are a couple of useful things happening in this service: the After parameter makes sure the code runs before the network is shut down, and the Conflicts parameter watches for OSMC shutting down.
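A minimal sketch of what the unit could look like.. the exact directives here are assumptions based on that description, so treat it as a starting point rather than a drop-in file;

[Unit]
Description=Turn off the Energenie socket when OSMC shuts down
# Ordered after the network at boot, so at shutdown it gets stopped before the network goes away
After=network-online.target
# Shutting down conflicts with this unit, which is what makes systemd run ExecStop on the way down
Conflicts=shutdown.target

[Service]
Type=oneshot
RemainAfterExit=yes
ExecStart=/bin/true
ExecStop=/home/osmc/callenergenie.sh

[Install]
WantedBy=multi-user.target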

Now add a helper script… this will make the webserver call as a background task, so control will be given back to the service immediately, rather than it waiting for the wget to complete.

/home/osmc/callenergenie.sh

This calls the PHP script, telling it which socket to turn off, and how long to delay before sending the command, which we’re doing so that the Pi has time to shut down before the power is cut.
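In sketch form, it’s little more than this (the IP address, socket number and delay value are all placeholders);

#!/bin/sh
# Ask the Energenie Pi to cut power to socket 2 after a 30-second delay,
# run in the background so the shutdown isn't held up waiting for the reply
wget -q -O /dev/null "http://192.168.1.50/callenergenie.php?socket=2&delay=30" &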

Scripts for the Pi with the Energenie shield

This is the PHP script I added to the other Pi which was already configured to be a PHP web server.

/var/www/html/callenergenie.php

To allow PHP to run the script as root, I needed to add the Apache user to the list of sudoers.. not that secure tho :( I’d be interested to hear from anyone who knows how to run the Energenie scripts as a regular user.. their Python doesn’t like it when it’s not root.

nano /etc/sudoers

www-data ALL=(ALL) NOPASSWD:ALL

The nice thing about the PHP script is that we can actually call it to turn the Pi on remotely too.. so you could configure that into a widget on your phone, or add it to Alexa.

Basic Automated Testing using FeatherTest for Chrome

There are a fair number of automated testing tools out there like Intern or Puppeteer, but I wanted something super simple and quick to set up to test a variety of pages on a site I develop.

I was making changes to JSON-LD, page metadata, and some of the data that’s sent to Omniture analytics. All the page types are slightly different, containing things like Documents, Discussions, Blog Posts, landing pages, etc. So I needed to go to a variety of URLs on my local dev environment to see what certain DOM elements got set to, and check the contents of some JavaScript variables.

I’d then need to do the same checks against our Dev, Staging and Production servers to make sure they all looked correct there too.

After a bit of searching I came across FeatherTest which is fed a text file where you use JavaScript/jQuery to define & run the tests.

It’s very easy to set up tests, and the same test script file can be run against whatever site you’re looking at in your browser. For more info go here;

https://xaviesteve.com/5302/feathertest-automated-website-testing-extension-google-chrome/

Typical FeatherTest Script Structure + Syntax

Tip 1 – Preserve Log

The output from FeatherTest goes into the console, so if you’re testing multiple pages, you’ll need to check the ‘Preserve Log’ option, otherwise you’ll lose the output as Chrome navigates between pages.

Tip 2 – Output Formatting

When writing scripts, I’d recommend colouring the console output, and prefixing each line with something easily identifiable. In my case I’m using ‘SUDO – ‘ and colouring the text orange.

You can then simply filter the console to just see your output;

When you’ve filtered the output, you can then save it to a file (right-click, save as..) and format it into shape.

I’ve found this really useful to monitor how my SEO data has improved as I’ve made changes, and to sanity check I’ve not broken anything between releases.

Tip 3 – Variable access

Some of the test scripts I’ve written needed access to variables defined in the scope of the main window.. FeatherTest can’t normally access these variables, so we need a helper function.

The helper function either needs baking into your site’s JavaScript, or you can inject it using an extension like TamperMonkey. Using TamperMonkey means you can use it on whatever site you want, not just ones you’re able to install the function on.

My code adds an event listener that you can call from FeatherTest which you can request variable values from.

e.g.

window.postMessage({ "action": "variable", "value": "myVar.hierarchy"}, window.origin);

This is the TamperMonkey/GreaseMonkey script I use;
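In outline it does something like this.. the origin check, the dotted-path lookup and logging the result with the ‘SUDO’ prefix are my assumptions, so adjust to taste;

// Listen for 'variable' requests posted from FeatherTest and log the value back out
window.addEventListener('message', function (event) {
  var data = event.data || {};
  if (event.origin !== window.origin || data.action !== 'variable') { return; }
  // Walk a dotted path like "myVar.hierarchy" from the page's window object
  // (depending on your TamperMonkey @grant settings you may need unsafeWindow here)
  var value = String(data.value).split('.').reduce(function (obj, key) {
    return obj ? obj[key] : undefined;
  }, window);
  console.log('%cSUDO - ' + data.value + ' = ' + JSON.stringify(value), 'color: orange');
});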

FeatherTest Script – Example 1 – DOM Lookup

Pulling data out of the DOM is straightforward;
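For instance, grabbing a couple of bits of page metadata with jQuery looks something like this (the selectors are just examples);

// Log the canonical URL and meta description, prefixed and coloured as per the tips above
console.log('%cSUDO - canonical: ' + $('link[rel="canonical"]').attr('href'), 'color: orange');
console.log('%cSUDO - description: ' + $('meta[name="description"]').attr('content'), 'color: orange');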

FeatherTest Script – Example 2 – Variable Access & JSON-LD

To access variables we’ll use the TamperMonkey function I posted above. We can also pull out any JSON-LD splats and present them quite nicely in the console output.
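Again as a rough sketch (the variable path is the same illustrative one as above);

// Ask the page for a variable via the TamperMonkey listener
window.postMessage({ "action": "variable", "value": "myVar.hierarchy" }, window.origin);

// Pull out any JSON-LD blocks and pretty-print them in the console
$('script[type="application/ld+json"]').each(function () {
  console.log('%cSUDO - JSON-LD:', 'color: orange');
  console.log(JSON.parse($(this).text()));
});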


Adding IR Remote Control Support to the Raspberry Pi

In my last post I took you through how I created a small portable media centre that I can easily take on holiday to hook up to the hotel TV.

To reduce the amount of space it took up, I used a cheap USB keypad which could be used to control the media centre. It worked really well & having something hard-wired meant I didn’t have to worry about a Bluetooth-paired device needing re-pairing.

However, what I then realised was it would be good to be able to use a spare remote control instead. I was using the OpenElec distribution and looked through their documentation for how to do this, but only found references to version 3 of the software (it’s on version 7) and how to get LIRC working with it. There were plenty of blog posts on hooking up IR support, but a lot of them were written 2-3 years ago, and the software has moved on somewhat.

Hardware Setup

What I did first was buy a suitable IR receiver. I chose the Vishay TSOP4838 (which costs less than £1) because of the voltage range (2.5-5.5v) and receiver frequency (38KHz). If you look at the datasheet for the product, you’ll see which pins should get wired up to the Pi;

Simply wire pin 1 to GPIO 18, pin 2 to GND, and pin 3 to a 3.3v power pin, e.g.

By using some short F-F jumper wires and a small cut in the side of the case, I was able to position the receiver neatly(ish) on the side.. it’s still easily removable, but you could integrate it into the case a bit more seamlessly than this ;)

Software Setup

Before this project I was using OpenElec, but had limited success getting the IR support working properly. I switched to OSMC which I’d read had better IR support through the main UI. I think I was actually on the right track with OpenElec, but I realised later that the old vintage Xbox remote I was trying to use wasn’t 100% working.

If you’re going to use a remote control that’s officially recognised, then you can skip past this part about learning IR remote control codes.

Learning IR remote commands

The remote I found in the loft was an old DVD player remote which (unsurprisingly) wasn’t in the list of pre-recognised remotes in the OSMC installation. I needed to get the Pi to learn the IR pulses being sent out by the remote and map them to the Kodi functions.

1. First off, you need to telnet to the Pi. Username: osmc, Password: osmc.

2. Next you need to stop the LIRC service which is being locked/used by Kodi

sudo systemctl stop lircd_helper@lirc0

3. Now you can run the IR learn mode.. this will record what it finds to the config file you specify;

irrecord -d /dev/lirc0 /home/osmc/lircd.conf

4. Follow the on-screen instructions which will recognise your remote.

One observation I had was that this only worked properly if I stopped after the first prompt to press lots of keys on the remote.. if I completed the second stage, the key mapping didn’t work, e.g.

If I ignored the second phase & let it abort, the learn process worked

When it’s working, you’ll be able to enter the Kodi function (like KEY_UP, KEY_DOWN, etc) & map it to a key press on your remote;

Once you’ve mapped all the functions you want, we then need to move back to OSMC and tell it to use that config file we’ve just written.

OSMC Settings

In OSMC you need to do the following;

1. Disable the CEC service (via System Settings > Input > Peripherals > CEC Adapter), which seems to be needed for LIRC to work.

2. Now go into OSMC settings and pick the Raspberry Pi icon

3. Go into Hardware Support and enable LIRC GPIO Support. You shouldn’t need to change anything if you connected the sensor to GPIO 18.

4. Now go back and select the Remote Control option.

5. Ignore the list of pre-installed remotes and select Browse;

6. Navigate to the folder where LIRC wrote your config file;

7. Confirm the change & reboot the box;

That should be it.. your remote should be able to control everything in Kodi.

Portable Raspberry Pi Media Center

We recently went on holiday and I took my laptop & VGA cable with me. It was my intention to hook it up to the TV and play some media on it to keep the kids happy on rainy days. However, it turned out the TV had the VGA port covered up by the wall-mounting bracket, and my laptop doesn’t have HDMI.. so we ended up putting the laptop on a chair and watching videos from there; it did the job, but wasn’t ideal.

At home we have a Fire TV Stick that could run Kodi, but the problem with Fire TV is that it has to have an internet connection, otherwise it doesn’t work (you can’t even get to Kodi!). Tethering it to my phone isn’t an option, since there are poor mobile signals in a lot of the places we visit.

Next time I’m going to be more prepared, with a more compact and flexible setup consisting of a Raspberry Pi 2 running OpenElec (and Kodi), together with a set of cables allowing me to hook it up to pretty much any TV. The Pi 2 runs Kodi really well, and the OpenElec distribution boots really quickly & has good Wifi and Bluetooth support. I initially chose a compact/travel USB-based keyboard instead of Bluetooth in case OpenElec ‘forgot’ the keyboard and I’d have nothing to navigate the menus with to re-pair it.

Cable-wise, I’ve got a 1m standard HDMI cable, which will be fine in most situations.. with a 2m HDMI extension lead if I can’t get the Pi near enough to the TV (some accommodation doesn’t have power sockets where you’d expect them). I’ve also got an RCA lead, with a SCART adapter as well.. so that helps if we get stuck with an older TV.

For media storage I’ve gone with a USB3 Flash Drive with a capacity of 64Gb, which gives us more to play with than the microSD card, and it’s super-fast for copying media from a PC. As soon as you plug in the flash drive, Kodi will show it in the menus.

So that’s it.. nothing groundbreaking or overly difficult to put together. The whole system is small enough to fit in a small travel bag & gives us a lot of flexibility when dealing with different hotels/accommodation. You may just find the TV accepts the USB flash drive and can play back whatever is on it.. but at least you’ll have all the gear you need if it doesn’t ;)

After I made the video, I bought a USB numeric keyboard from eBay for a paltry £2.. that’s compacted the kit even further, allowing it to fit in an old camera bag.

The keypad isn’t instantly recognised by Kodi, but an easy way to get it up and running is to use the Keymap Add-on. Attach a normal USB keyboard and the keypad at the same time.. start the add-on and use the keyboard to activate the remap process. From there, it’s dead simple to map the keypad to the different Kodi functions.

Here’s the full kit list;

1 x Raspberry Pi 2
1 x 8Gb MicroSD card
1 x 2m HDMI extension cable
1 x 1m HDMI cable
1 x USB numeric keypad
1 x RCA to SCART adapter
1 x 3.5mm plug to RCA lead
1 x 64Gb USB3 Flash Drive
1 x USB power supply + cable

Update 1:

Just got back from a week at Center Parcs (Woburn) and was really pleased to find an HDMI socket on the wall, and a power socket.. it made it super easy to hook up the Pi to the TV :)

Update 2:

We spent a week near Blackpool, and the accommodation we stayed in had a patch panel as well! Seems like they’re quite common these days :)