Remote Garage Door Sensor/Opener
So, I live on the top floor of a three-storey building. My garage has an electric door and sits vertically below my apartment on the 'basement' level. The following post details my efforts to know if the door is open, and to possibly control it remotely. In the past, I've accidentally left the door open and I want to be able to prevent this! I also want to be able to access the garage when the keys aren't nearby.
I've already tried a crappy 433MHz universal remote from eBay, which did train against the garage remote. I then went to test it and only around 1-in-50 key-presses worked. The purchase ended up being a quick and cheap way to work out that the Vicway V-380G has rolling codes!
Fortunately, this model of garage door opener also has a terminal block with a screw terminal named DOOR. I asked the manufacturer if I could just short that terminal to the one next to it named GND and they confirmed that doing so will operate the door! What should I use to do this? Arduino? Raspi? Any of the above? Sure... but how do I communicate with any of them from three concrete storeys above?
- WIRELESS: Does not work... the signal doesn't reach. My mobile phone gets no wifi signal when in the garage. It does work out in the carpark though.
- ETHERNET: Ok sure... run a cable down the side of the building? Is there an inner cable cavity between solid concrete floors? No, but, ethernet-over-power does work! Do I want my LAN exposed to the entire building's power grid? I dunno.
- LORA: Will this be able to communicate through three layers of solid concrete? Wifi couldn't. I'm still going to give it a bash. Will the signal be encrypted? Could any hacker just muck around with my garage door if they wanted to?
POC #1: Ethernet
I have two pairs of ethernet-over-power devices, and I tested a Netcomm set first. Plugging it in saw the home link light mainly solid, but it was blinking more than it usually should. I assume the connection through the building's power grid is noisy-AF and, well, I didn't expect anything less. Regardless... I wasn't here to stream 4K60, I just wanted a crappy website hosted on the Arduino with a button and a bit of status information.
An Arduino Ethernet Shield was purchased from Jaycar, along with a tiny relay board. I have a few boards with 4 or 8 relays... but I didn't want to waste them. The example webserver sketch was uploaded to a UNO with the Ethernet shield attached and the unit was moved to the garage with the ethernet-over-power in place.
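For reference, the webserver test boils down to something like the following (a minimal sketch in the same spirit as the stock Ethernet example; the MAC address, static IP and page content are placeholders for illustration, not my actual upload):

    #include <SPI.h>
    #include <Ethernet.h>

    // Placeholder MAC and static IP; adjust for your LAN.
    byte mac[] = { 0xDE, 0xAD, 0xBE, 0xEF, 0xFE, 0xED };
    IPAddress ip(192, 168, 1, 177);
    EthernetServer server(80);

    void setup() {
      Ethernet.begin(mac, ip);
      server.begin();
    }

    void loop() {
      EthernetClient client = server.available();
      if (client) {
        // Discard whatever request arrived, then reply with a tiny status page.
        while (client.connected() && client.available()) client.read();
        client.println("HTTP/1.1 200 OK");
        client.println("Content-Type: text/html");
        client.println("Connection: close");
        client.println();
        client.println("<html><body>Garage door status goes here</body></html>");
        delay(1);
        client.stop();
      }
    }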
And ... nothing. Link, but no data. I don't know what I was expecting... ethernet-over-power was not going to work.
POC #2: LoRa with MQTT
There are a lot of great blogs on the internet... and whenever I think of a cool idea for a project... I know someone has already done it... somewhere... somehow. What I wasn't expecting was to find someone in NZ who was considering the exact same setup of components from Jaycar. Jon, thank you for the work you put in to documenting the caveats of these older-style components. I went ahead and purchased both the LoRa Shield (I already had enough spare Arduinos at home) and the LoRa Gateway.
The LoRa Gateway is an all-in-one LoRa radio + Arduino + Linux SOC. The SOC talks to the Arduino via SPI and uses Yun's Bridge Protocols. If you search for any of this, then you'll quickly find that all of this tech is OLD. LoRaWAN is the new standard and this is all LoRa-only.
I had plans to host a webpage on the gateway and send data directly to the LoRa node, but I quickly realised that extending the gateway to do my bidding was going to be difficult. It uses a flash disk with most files read-only, and the website is hosted from ROM! Instead, I'd have to have the gateway actually be a gateway and transmit data to some other server on my LAN.
The setup would therefore be configured as follows...
| Garage | | Gateway | | | | NAS | | |
|---|---|---|---|---|---|---|---|---|
| Arduino (Read Sensors) | –> | LoRa Gateway (Arduino) | –> | LoRa Gateway (Linux SOC) | –> | MQTT Broker | <–> | Website |
| Arduino (Operate Relay) | <– | LoRa Gateway (Arduino) | <– | LoRa Gateway (Linux SOC) | <– | MQTT Broker | –> | SQL Database |
I chose a local MQTT Broker, as I didn't really want to sign up to yet another external webservice (they seem to use Thingspeak), only to then have to bring the data back down to the local network again. The local broker wouldn't persist data, so I would also have to work out a mechanism to store it in a DB somewhere. Fortunately (hah, or so I thought) I already had MS SQL running in docker on the NAS.
Setting up the Arduino IDE
Following the actual Dragino documentation with a newer version of the Arduino IDE will end in misery. I installed the latest 2.3+ version and got no end of missing-library errors. So, what to do? Spin up Win7 on my NAS and set up Arduino 1.8.6.
Once installed, add the following URL to the Additional Boards Manager URLs: http://www.dragino.com/downloads/downloads/YunShield/package_dragino_yun_test_index.json
Now go to the Boards Manager and install the Dragino Yun set of boards:
Next, download the RadioHead Library and extract it into Documents\Arduino\libraries.
Finally, select the board, the port and load up the LoRa_Simple_Server_Yun sketch from the Dragino examples.
If it compiles... try an upload! You'll need to enter your LG01's password...
... and then ...
Seamless! Amazing actually. The code has passed through a Linux SOC and then been transmitted over SPI into the Atmel co-CPU? If this didn't work, then write a comment below and tell me what happened. Next up, set up your first node. I used a spare Leonardo with a LoRa Shield.
Download this sketch (or Jon's version), modify the node_id from 12345 to something relevant (leaving the less-than and greater-than symbols in-place) and check the serial monitor on both sides...
It... works! An RSSI of -21 is very good... since the nodes are on the workbench, right next to each other. Let's see what happens when I shift the node to the garage.
Integrate the Garage
The garage door motor has a terminal block with DOOR and GND. These need to be bridged to activate (open or close) the door... so I bought a relay module and wired it across. Next up, I wanted to know if the door was open or closed, so I bought two magnetic reed switches and installed them at either end of the door's chain travel. I also threw on a temperature and humidity sensor.
With this all wired up, the Arduino was ready to transmit statistics and react to commands.
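Pulling that together, the garage node firmware ends up along these lines (a condensed sketch rather than my exact code: the pin numbers, the DHT22 sensor/library, the 915MHz frequency and the payload key names are assumptions for illustration):

    #include <SPI.h>
    #include <RH_RF95.h>
    #include <DHT.h>

    // Assumed pin assignments for illustration.
    #define RELAY_PIN    5   // bridges DOOR to GND via the relay module
    #define REED_OPEN    6   // reed switch at the 'fully open' end of the chain travel
    #define REED_CLOSED  7   // reed switch at the 'fully closed' end
    #define DHT_PIN      8

    RH_RF95 rf95;                // LoRa Shield defaults (CS 10, INT 2)
    DHT dht(DHT_PIN, DHT22);     // assuming a DHT22 on the Adafruit DHT library

    void setup() {
      pinMode(RELAY_PIN, OUTPUT);
      pinMode(REED_OPEN, INPUT_PULLUP);
      pinMode(REED_CLOSED, INPUT_PULLUP);
      dht.begin();
      rf95.init();
      rf95.setFrequency(915.0);  // match the gateway and your region
    }

    void loop() {
      // Report temperature, humidity and both reed switches, e.g.
      // "temp=23.5&hum=60.0&open=1&closed=0"
      String payload = "temp=" + String(dht.readTemperature(), 1) +
                       "&hum=" + String(dht.readHumidity(), 1) +
                       "&open=" + String(digitalRead(REED_OPEN)) +
                       "&closed=" + String(digitalRead(REED_CLOSED));
      rf95.send((uint8_t*)payload.c_str(), payload.length() + 1);
      rf95.waitPacketSent();

      // Listen briefly for a command back from the gateway.
      if (rf95.waitAvailableTimeout(5000)) {
        uint8_t buf[RH_RF95_MAX_MESSAGE_LEN];
        uint8_t len = sizeof(buf);
        if (rf95.recv(buf, &len)) {
          if (len < sizeof(buf)) buf[len] = '\0';   // ensure termination
          if (strcmp((char*)buf, "OPEN_DOOR") == 0) {
            digitalWrite(RELAY_PIN, HIGH);          // momentarily short DOOR to GND
            delay(500);
            digitalWrite(RELAY_PIN, LOW);
          }
        }
      }
    }

With the reed switches on the internal pull-ups, a reading of 0 means "the door is at this end of its travel", which is why the website later treats a value of "0" as the door being fully open or fully closed.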
Provide basic feedback from the Gateway
I was initially going to use an Arduino Buzzer from Jaycar and the Buzzer Library (who doesn't want bits of the Super Mario tune playing!?), so I could get audio feedback to know what the garage door is doing. Unfortunately, including and operating the buzzer interfered with the Yun Bridge Library communication when writing/reading files from the Linux SOC.
It seems that Timer or PWM operation trashes the Process call, and no amount of asynchronous or shell-command tinkering worked. Even the FileSystem write command spewed out empty files. I didn't even bother begging for help on the forums as the library is already deprecated. One note though... the official Dragino repo has Arduino 1.6.9 for download, and I wonder if this is the final 'supported' version for this gateway. Using 1.8.6 might be the cause of these bugs?
Instead, I purchased a self-contained piezo buzzer and wired it up via my transistor power method. I wasn't sure if the digital pins could deal with the current, so I used the transistor as a switch. I also grabbed a large flashing LED for a consistent 'door is open' status notification.
Attaching all this to the Gateway was easy enough as they provide a really nice screw-terminal block. The pinout is as follows:
| 1 | | 2 | | 3 | | 4 | | 5 | | 6 | | 7 | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| + | - | + | - | + | - | + | - | + | - | + | - | a | b |
| +5v | GND | A0 | A1 | GND | A3 | A4 | A5 | A6 | A7 | N/A | GND | D3 | D4 |
I chose to integrate with the Digital pins 3 and 4 for my buzzer and LED respectively.
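In the gateway sketch, the feedback then reduces to a fragment like this (illustrative only; the helper names, beep length and blink behaviour are assumptions, and the self-contained buzzer and self-flashing LED only need to be switched on and off, which neatly avoids the Timer/PWM clash above):

    // Feedback wired to the gateway's screw terminals (see pinout above).
    #define BUZZER_PIN 3   // drives the transistor switching the piezo buzzer
    #define LED_PIN    4   // large flashing LED for the 'door is open' state

    void setupFeedback() {
      pinMode(BUZZER_PIN, OUTPUT);
      pinMode(LED_PIN, OUTPUT);
    }

    // Short beep whenever a door command is relayed over LoRa.
    void beep() {
      digitalWrite(BUZZER_PIN, HIGH);
      delay(200);
      digitalWrite(BUZZER_PIN, LOW);
    }

    // Call regularly from loop(); keeps the LED powered while the door is open.
    void showDoorOpen(bool doorOpen) {
      digitalWrite(LED_PIN, doorOpen ? HIGH : LOW);
    }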
MQTT Broker Installation
This MQTT Server and Node Tutorial helped a lot, but I still didn't want to use Thingspeak. Instead, I overloaded my NAS further with the Eclipse Mosquitto Docker Image.
I created a base folder on the NAS as the home of Mosquitto. In here, I downloaded the default configuration file and created a data and log directory. The default configuration only allows local connections, so we'll need to edit it to allow external nodes to report to it. There's a great guide here, but the basic idea is to add the following lines to the configuration file:
    # listener port-number [ip address/host name/unix socket path]
    listener 1883 0.0.0.0
    listener 9001 0.0.0.0
    protocol websockets
Search for #listener and replace that chunk with the bit above. Now we need to create the server so that we can shell in and create a password file. Port 9001 will be used for WebSocket connections and port 1883 will be for standard TCP connections. I've called it mosqii here, but you can call it whatever you like. Just remember to use the new name in each subsequent command.
    $ docker create --name mosqii -it \
        -p 1883:1883 -p 9001:9001 \
        -v /volume2/SSD/mosquitto/mosquitto.conf:/mosquitto/config/mosquitto.conf \
        -v /volume2/SSD/mosquitto/data:/mosquitto/data \
        -v /volume2/SSD/mosquitto/log:/mosquitto/log \
        eclipse-mosquitto
Once created, start it with:
$ docker start mosqii
With it up and running, use the following command to connect via sh and create a password file with a username and password of your choice:
    $ docker exec -it mosqii sh
    / # cd mosquitto/config
    /mosquitto/config # mosquitto_passwd -c garage_door_auth garage_door
    Password:
    Reenter password:
    /mosquitto/config #
Type exit and get out of docker. Stop the docker container with:
$ docker stop mosqii
Edit mosquitto.conf once again, adding the newly-created password file to our listener configuration. We couldn't do this at the start as the server would crash if the file wasn't found.
    # listener port-number [ip address/host name/unix socket path]
    listener 1883 0.0.0.0
    listener 9001 0.0.0.0
    protocol websockets
    password_file /mosquitto/config/garage_door_auth
Download MQTT Explorer and connect to the server to test your settings...
Data!
Configure the Gateway for MQTT
With the Gateway connected to my local LAN, I browsed to my router and checked the DHCP listing. duinotech-xxxxx was listed at IP 192.168.1.156 and a quick browse via Chrome brought up the internal website. If you've bought one from Jaycar, then the initial password is duinotech.
The unit was quickly upgraded with firmware v4.3.7 and the root password then changed to dragino.
Let it reboot and do its thing... then browse to the Servers page and select MQTT with debugging enabled.
Next head to the MQTT Settings page and fill out the private server details. Add a row to the channels table so that we have something to match on when the LoRa node sends data.
With it all configured, return to the system logging and see what's going on. Thankfully the logging is pretty self-explanatory!
Note that initially I had no data flowing to the newly created server. I actually had to SSH into the Gateway to work out what was going on, but even getting in wasn't easy...
    $ ssh -l root 192.168.1.156
    Unable to negotiate with 192.168.1.156 port 22: no matching key exchange method found.
    Their offer: diffie-hellman-group14-sha1,diffie-hellman-group1-sha1,kexguess2@matt.ucc.asn.au
Turns out it's using old key-exchange methods and you need to override your ssh client's defaults to force it to connect...
    $ ssh -oKexAlgorithms=+diffie-hellman-group1-sha1 -oHostKeyAlgorithms=+ssh-rsa -l root 192.168.1.156
    root@192.168.1.156's password:

    BusyBox v1.23.2 (2019-01-10 15:05:04 CST) built-in shell (ash)

     ____  ____      _    ____ ___ _   _  ___
    |  _ \|  _ \    / \  / ___|_ _| \ | |/ _ \
    | | | | |_) |  / _ \| |  _ | ||  \| | | | |
    | |_| |  _ <  / ___ \ |_| || || |\  | |_| |
    |____/|_| \_\/_/   \_\____|___|_| \_|\___/

            W i F i, L i n u x, M C U, E m b e d d e d

     OpenWRT Chaos Calmer 15.05
     Version: Dragino-v2 IoT-4.3.7 Build Wed Sep 11 22:30:26 CST 2019
     www.dragino.com
    ----------------------------------------------------
    root@dragino-17b9d2:~#
All scripts are in /etc/iot/scripts/. I was going to inspect mqtt_process.sh, but decided to just follow Jon's instructions and replace the existing file.
    root@dragino-17b9d2:~# cd /etc/iot/scripts/
    root@dragino-17b9d2:/etc/iot/scripts# ls
    lg01_pkt_fwd         mqtt_process.sh   mqtt_process_old.sh   mqtt_sub.sh
    polish_mqtt_config   tcp_client        tcp_client.lua        xively_routine.lua
    root@dragino-17b9d2:/etc/iot/scripts#
The replacement script is here, but that also caused issues for me. I could see in the web logs that it was complaining that "12345/" didn't match any known configuration keys. It turns out that a trailing slash was causing the issue. This can be fixed on line 66 of Jon's mqtt_process.sh by removing the trailing slash so that it reads:
CID=`ls /var/iot/channels/`
With this fix done, the data was flowing to the MQTT Broker and visible in MQTT Explorer!
Web remote
A quick website was spun up in a folder on my Windows machine. It uses MQTT.js and reports the data from the garage.
    <html>
    <head>
    <script src="https://code.jquery.com/jquery-3.7.1.min.js" integrity="sha256-/JqT3SQfawRcv/BIHPThkBvs0OEvtFFmqPF/lYI/Cxo=" crossorigin="anonymous"></script>
    <script src="https://unpkg.com/mqtt/dist/mqtt.min.js"></script>
    <script>
    const url = 'mqtt://someserver.somewhere.org:9001'

    const options = {
      // Clean session
      clean: true,
      connectTimeout: 4000,
      // Authentication
      clientId: 'garage_door_ctrl',
      username: 'garage_door',
      password: 'some_password_here',
    }

    const client = mqtt.connect(url, options)

    client.on('connect', function () {
      console.log('Connected')
      // Subscribe to a topic
      client.subscribe('channels/garage_door/publish/this_is_a_key', function (err) {
        console.log('ERROR: ' + err);
      })
    })

    client.on("message", (topic, message) => {
      // message is Buffer
      //console.log(topic.toString() + ": " + message.toString());
      var splitted = message.toString().split('&');
      $("#temp").text(splitted[0].split("=")[1]);
      $("#humidity").text(splitted[1].split("=")[1]);
      if (splitted.length > 3) {
        op = splitted[2].split("=")[1];
        cl = splitted[3].split("=")[1]
        $("#open_d").text(op);
        $("#closed_d").text(cl);
        if (op == "0") {
          $("#open_close_btn").text("Close Door");
          $("#open_close_btn").prop("disabled", "");
        } else if (cl == "0") {
          $("#open_close_btn").text("Open Door");
          $("#open_close_btn").prop("disabled", "");
        } else {
          $("#open_close_btn").text("...moving...");
          $("#open_close_btn").prop("disabled", "disabled");
        }
      }
    });

    function open_close() {
      client.publish('channels/garage_commands', 'open_door');
      $("#open_close_btn").prop("disabled", "disabled");
    }
    </script>
    </head>
    <body>
    <center>
    <button onclick="open_close();" id="open_close_btn" disabled="disabled" style="width:90%;height:200px;font-family:tahoma; font-size:32pt;">...determining position...</button>
    <table style="font-family:tahoma; font-size:32pt;">
      <tr>
        <td>T:<span id=temp>?</span></td>
        <td>/</td>
        <td>H:<span id=humidity>?</span></td>
        <td>/</td>
        <td>O:<span id=open_d>?</span></td>
        <td>/</td>
        <td>C:<span id=closed_d>?</span></td>
      </tr>
      <tr>
        <td colspan=4>
          <center id=message></center>
        </td>
      </tr>
    </table>
    </center>
    </body>
    </html>
It also sends an "open_door" message to the MQTT server on channels/garage_commands when you hit the button. Note that you need to have enabled websockets on 9001 to be able to use the MQTT.js library!
Sending data back to the Garage
One-way data is great, but how do we get data flowing in the other direction? Turns out we actually need to subscribe to the MQTT Broker on the LG01 and act when data changes. Unfortunately, the LG01-S doesn't support configuring MQTT subscription settings in the Web UI. The documentation here indicates there should be an MQTT Subscribe configuration area, but we don't have it. Fortunately, it does tell us the shell commands!
mosquitto_sub -h 192.168.199.148 -p 1883 -i dragino-1b7060 -t command
So, we can use mosquitto_sub to get commands from the MQTT broker... but how do we then get them into the MCU to send over LoRa? Turns out there's a 'talkback' server demo in the GitHub repo which makes use of the Yun Process Library once again.
The goal is to store the commands received from MQTT in a file somewhere on the Linux SOC (without bashing the flash too much and causing it to wear out) and then read the file back on the MCU. The Linux SOC offers a /tmp folder which is actually hosted in RAM, so this'll work for us. We can then read the file via the Yun Process Library on the MCU and, if the content of the file matches a command that we want to react to, send a command via the LoRa network to the node in the garage and clear the local file. Actually, we should clear the local file regardless, as there are no other consumers and we want to know if/when another MQTT message comes in.
So, on the Linux side of the Gateway I created an mqtt_sub.sh script in the /etc/iot/scripts/ folder:
    #!/bin/sh
    touch /var/last_command
    while true
    do
        echo "waiting..."
        mosquitto_sub -C 1 -u garage_door -P l0r@l0r@ -h 192.168.1.61 -p 1883 -t channels/garage_commands > /var/last_command_tmp
        mv /var/last_command_tmp /var/last_command
        cat /var/last_command
    done
Since we now have a file to read, I added the following chunk to the gateway's Arduino sketch so that it could send out a message over LoRa:
    void checkForCommands() {
      String lastCommand = "";

      Process pDown;
      pDown.begin("cat");
      pDown.addParameter("/tmp/last_command");
      pDown.run(); // Run the process and wait for its termination

      while (pDown.available() > 0) {
        char c = pDown.read();
        if (c != '\n' && c != '\r')
          lastCommand += c;
      }

      if (lastCommand != "") {
        if (lastCommand == "open_door") {
          Console.println("Opening door!...");
          uint8_t data[] = "OPEN_DOOR";
          rf95.send(data, sizeof(data));
          rf95.waitPacketSent();
        } else {
          Console.print("Unknown command: ");
          Console.println(lastCommand);
        }

        File last_cmd = FileSystem.open("/tmp/last_command", FILE_WRITE);
        last_cmd.println("");
        last_cmd.close();
      }
    }
And it worked! The message was sent to the LoRa node, and the Arduino code on the node was updated accordingly to watch for messages and act!
The only note above is that the /var folder is actually a symlink to /tmp, so they can be used interchangeably. Also note that the step to write from the temporary file to the real command file is essential. If you try to pipe out from mosquitto_sub directly then the file will be empty until a message comes in.
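For completeness, the gateway MCU's loop ends up handling both directions, roughly like this (a hedged sketch of the flow rather than the actual Dragino example: the channel filename 12345 follows the channel ID configured earlier, and the uplink file is what mqtt_process.sh picks up and publishes):

    void loop() {
      // Uplink: copy any LoRa packet from the garage node into the channel file
      // that mqtt_process.sh reads and publishes to the broker.
      if (rf95.available()) {
        uint8_t buf[RH_RF95_MAX_MESSAGE_LEN];
        uint8_t len = sizeof(buf);
        if (rf95.recv(buf, &len)) {
          File ch = FileSystem.open("/var/iot/channels/12345", FILE_WRITE);
          ch.print((char*)buf);
          ch.close();
        }
      }

      // Downlink: poll the command file that mqtt_sub.sh keeps updated.
      checkForCommands();

      delay(1000);
    }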
I then added this to /etc/rc.local so that it starts on boot of the gateway. Make sure you keep the ampersand at the end!
    # Put your custom commands here that should be executed once
    # the system init finished. By default this file does nothing.

    /etc/iot/scripts/mqtt_sub.sh &

    exit 0
With this, the unit was listening consistently to the MQTT broker for commands.
Storing the data long-term!
For data persistence... any sane person would use MySQL. Unfortunately, I have MS SQL running in docker... because I could... as I use it for other application development. Instead of running yet another DB on the NAS, it'd be nice to store this data in there also! Of course, it's never that easy! There's a great example here using Python, but I need the ODBC driver from Microsoft... and I'm running this raw on the NAS... so no package manager!
So, yeah, is this going to have to be ANOTHER docker container? Do I have to? Can this NAS even handle all of this? Can't I run a plugin on the MS SQL docker container? Let's see what's running on the shell inside the SQL container and if I can just run an extra process.
    $ docker exec -u 0 -it sql1 /bin/bash
    root@sql1:/# python3
    Python 3.10.12 (main, Nov 20 2023, 15:14:05) [GCC 11.4.0] on linux
    Type "help", "copyright", "credits" or "license" for more information.
    >>> exit()
    root@sql1:/# pip
    bash: pip: command not found
    root@sql1:/# apt-get install pip
    Reading package lists... Done
    Building dependency tree... Done
    Reading state information... Done
    E: Unable to locate package pip
    root@sql1:/# apt-get update
    Get:1 https://packages.microsoft.com/ubuntu/22.04/prod jammy InRelease [3632 B]
    Get:2 http://security.ubuntu.com/ubuntu jammy-security InRelease [129 kB]
    Get:3 http://archive.ubuntu.com/ubuntu jammy InRelease [270 kB]
    Get:4 https://packages.microsoft.com/ubuntu/22.04/prod jammy/main arm64 Packages [40.1 kB]
    Get:5 https://packages.microsoft.com/ubuntu/22.04/prod jammy/main amd64 Packages [164 kB]
    Get:6 https://packages.microsoft.com/ubuntu/22.04/prod jammy/main all Packages [1035 B]
    Get:7 https://packages.microsoft.com/ubuntu/22.04/prod jammy/main armhf Packages [14.6 kB]
    Get:8 http://security.ubuntu.com/ubuntu jammy-security/restricted amd64 Packages [2771 kB]
    Get:9 http://archive.ubuntu.com/ubuntu jammy-updates InRelease [128 kB]
    Get:10 http://archive.ubuntu.com/ubuntu jammy-backports InRelease [127 kB]
    Get:11 http://archive.ubuntu.com/ubuntu jammy/main amd64 Packages [1792 kB]
    Get:12 http://security.ubuntu.com/ubuntu jammy-security/multiverse amd64 Packages [44.7 kB]
    Get:13 http://security.ubuntu.com/ubuntu jammy-security/universe amd64 Packages [1129 kB]
    Get:14 http://security.ubuntu.com/ubuntu jammy-security/main amd64 Packages [2104 kB]
    Get:15 http://archive.ubuntu.com/ubuntu jammy/multiverse amd64 Packages [266 kB]
    Get:16 http://archive.ubuntu.com/ubuntu jammy/restricted amd64 Packages [164 kB]
    Get:17 http://archive.ubuntu.com/ubuntu jammy/universe amd64 Packages [17.5 MB]
    Get:18 http://archive.ubuntu.com/ubuntu jammy-updates/restricted amd64 Packages [2858 kB]
    Get:19 http://archive.ubuntu.com/ubuntu jammy-updates/multiverse amd64 Packages [51.8 kB]
    Get:20 http://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 Packages [1420 kB]
    Get:21 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 Packages [2378 kB]
    Get:22 http://archive.ubuntu.com/ubuntu jammy-backports/main amd64 Packages [81.0 kB]
    Get:23 http://archive.ubuntu.com/ubuntu jammy-backports/universe amd64 Packages [33.7 kB]
    Fetched 33.4 MB in 14s (2340 kB/s)
    Reading package lists... Done
    root@sql1:/# apt-get install pip
    Reading package lists... Done
    Building dependency tree... Done
    Reading state information... Done
    Note, selecting 'python3-pip' instead of 'pip'
    The following additional packages will be installed:
      binutils binutils-common binutils-x86-64-linux-gnu build-essential cpp cpp-11 dirmngr dpkg-dev fakeroot fontconfig-config fonts-dejavu-core g++ g++-11 gcc gcc-11 gcc-11-base gnupg gnupg-l10n gnupg-utils gpg gpg-agent gpg-wks-client gpg-wks-server gpgconf gpgsm javascript-common libalgorithm-diff-perl libalgorithm-diff-xs-perl libalgorithm-merge-perl libasan6 libassuan0 libbinutils libc-dev-bin libc-devtools libc6 libc6-dbg libc6-dev libcc1-0 libcrypt-dev libctf-nobfd0 libctf0 libdeflate0 libdpkg-perl libexpat1-dev libfakeroot libfile-fcntllock-perl libfontconfig1 libfreetype6 libgcc-11-dev libgd3 libgdbm-compat4 libgdbm6 libgomp1 libisl23 libitm1 libjbig0 libjpeg-turbo8 libjpeg8 libjs-jquery libjs-sphinxdoc libjs-underscore libksba8 liblocale-gettext-perl liblsan0 libmpc3 libnpth0 libnsl-dev libperl5.34 libpng16-16 libpython3-dev libpython3.10 libpython3.10-dev libpython3.10-minimal libpython3.10-stdlib libquadmath0 libstdc++-11-dev libtiff5 libtirpc-dev libtsan0 libubsan1 libwebp7 libx11-6 libx11-data libxau6 libxcb1 libxdmcp6 libxpm4 linux-libc-dev lto-disabled-list make manpages manpages-dev netbase patch perl perl-modules-5.34 pinentry-curses python3-dev python3-distutils python3-lib2to3 python3-pkg-resources python3-setuptools python3-wheel python3.10 python3.10-dev python3.10-minimal rpcsvc-proto xz-utils zlib1g-dev
    Suggested packages:
      binutils-doc cpp-doc gcc-11-locales dbus-user-session libpam-systemd pinentry-gnome3 tor debian-keyring g++-multilib g++-11-multilib gcc-11-doc gcc-multilib autoconf automake libtool flex bison gcc-doc gcc-11-multilib parcimonie xloadimage scdaemon apache2 | lighttpd | httpd glibc-doc git bzr libgd-tools gdbm-l10n libstdc++-11-doc make-doc man-browser ed diffutils-doc perl-doc libterm-readline-gnu-perl | libterm-readline-perl-perl libtap-harness-archive-perl pinentry-doc python-setuptools-doc python3.10-venv python3.10-doc binfmt-support
    Recommended packages:
      libnss-nis libnss-nisplus
    The following NEW packages will be installed:
      binutils binutils-common binutils-x86-64-linux-gnu build-essential cpp cpp-11 dirmngr dpkg-dev fakeroot fontconfig-config fonts-dejavu-core g++ g++-11 gcc gcc-11 gcc-11-base gnupg gnupg-l10n gnupg-utils gpg gpg-agent gpg-wks-client gpg-wks-server gpgconf gpgsm javascript-common libalgorithm-diff-perl libalgorithm-diff-xs-perl libalgorithm-merge-perl libasan6 libassuan0 libbinutils libc-dev-bin libc-devtools libc6-dev libcc1-0 libcrypt-dev libctf-nobfd0 libctf0 libdeflate0 libdpkg-perl libexpat1-dev libfakeroot libfile-fcntllock-perl libfontconfig1 libfreetype6 libgcc-11-dev libgd3 libgdbm-compat4 libgdbm6 libgomp1 libisl23 libitm1 libjbig0 libjpeg-turbo8 libjpeg8 libjs-jquery libjs-sphinxdoc libjs-underscore libksba8 liblocale-gettext-perl liblsan0 libmpc3 libnpth0 libnsl-dev libperl5.34 libpng16-16 libpython3-dev libpython3.10-dev libquadmath0 libstdc++-11-dev libtiff5 libtirpc-dev libtsan0 libubsan1 libwebp7 libx11-6 libx11-data libxau6 libxcb1 libxdmcp6 libxpm4 linux-libc-dev lto-disabled-list make manpages manpages-dev netbase patch perl perl-modules-5.34 pinentry-curses python3-dev python3-distutils python3-lib2to3 python3-pip python3-pkg-resources python3-setuptools python3-wheel python3.10-dev rpcsvc-proto xz-utils zlib1g-dev
    The following packages will be upgraded:
      libc6 libc6-dbg libpython3.10 libpython3.10-minimal libpython3.10-stdlib python3.10 python3.10-minimal
    7 upgraded, 103 newly installed, 0 to remove and 14 not upgraded.
    Need to get 113 MB of archives.
    After this operation, 318 MB of additional disk space will be used.
    Do you want to continue? [Y/n]
That's a positive shiteload of dependencies... but we're on the NAS... I have ~25TB free... so let's goooooo....
    Get:106 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-distutils all 3.10.8-1~22.04 [139 kB]
    Get:107 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-dev amd64 3.10.6-1~22.04 [26.0 kB]
    Get:108 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-setuptools all 59.6.0-1.2ubuntu0.22.04.1 [339 kB]
    Get:109 http://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 python3-wheel all 0.37.1-2ubuntu0.22.04.1 [32.0 kB]
    Get:110 http://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 python3-pip all 22.0.2+dfsg-1ubuntu0.4 [1305 kB]
    Fetched 113 MB in 38s (3007 kB/s)
    debconf: delaying package configuration, since apt-utils is not installed
    (Reading database ... 9933 files and directories currently installed.)
    Preparing to unpack .../libc6-dbg_2.35-0ubuntu3.8_amd64.deb ...
    Unpacking libc6-dbg:amd64 (2.35-0ubuntu3.8) over (2.35-0ubuntu3.7) ...
    Preparing to unpack .../libc6_2.35-0ubuntu3.8_amd64.deb ...
    debconf: unable to initialize frontend: Dialog
    debconf: (No usable dialog-like program is installed, so the dialog based frontend cannot be used. at /usr/share/perl5/Debconf/FrontEnd/Dialog.pm line 78.)
    debconf: falling back to frontend: Readline
    debconf: unable to initialize frontend: Readline
    debconf: (Can't locate Term/ReadLine.pm in @INC (you may need to install the Term::ReadLine module) (@INC contains: /etc/perl /usr/local/lib/x86_64-linux-gnu/perl/5.34.0 /usr/local/share/perl/5.34.0 /usr/lib/x86_64-linux-gnu/perl5/5.34 /usr/share/perl5 /usr/lib/x86_64-linux-gnu/perl-base /usr/lib/x86_64-linux-gnu/perl/5.34 /usr/share/perl/5.34 /usr/local/lib/site_perl) at /usr/share/perl5/Debconf/FrontEnd/Readline.pm line 7.)
    debconf: falling back to frontend: Teletype
    /var/lib/dpkg/tmp.ci/preinst: 9: arithmetic expression: expecting primary: "5 * 10000 + 13 * 100 + "
    dpkg: error processing archive /var/cache/apt/archives/libc6_2.35-0ubuntu3.8_amd64.deb (--unpack):
     new libc6:amd64 package pre-installation script subprocess returned error exit status 2
    Errors were encountered while processing:
     /var/cache/apt/archives/libc6_2.35-0ubuntu3.8_amd64.deb
    E: Sub-process /usr/bin/dpkg returned an error code (1)
    root@sql1:/#
Short-lived excitement. It seems it wants to run a configuration screen but can't work out how to display a text GUI? Anyway, this is the internet... someone has had this error before. So, remove the half-installed packages, set up apt-utils and go:
    apt install apt-utils
    echo 'debconf debconf/frontend select Noninteractive' | debconf-set-selections
There's actually a second error above regarding arithmetic. It's trying to calculate a version number, but the NAS is reporting an x for the minor version via uname. We need to mask /bin/uname and make it provide the information we need (thanks to the help over here):
    root@sql1:/bin# apt-get install nano
    ...
    root@sql1:/bin# mv /bin/uname /bin/uname_old
    root@sql1:/bin# nano /bin/uname
Paste in the following, replacing the version number appropriately:
    #!/bin/sh
    case "$1" in
        "-r") echo 5.13.0 ;;
        *) /bin/uname_old $1
    esac
Don't forget to chmod a+x it. You should then get:
    root@sql1:/bin# uname -r
    5.13.0
    root@sql1:/bin# uname -a
    Linux sql1 5.13.x #1 SMP Wed Jun 12 00:11:51 CST 2024 x86_64 x86_64 x86_64 GNU/Linux
And then, try try try again...
    root@sql1:/usr/bin# apt-get install pip
    Reading package lists... Done
    Building dependency tree... Done
    Reading state information... Done
    Note, selecting 'python3-pip' instead of 'pip'
    The following additional packages will be installed:
      binutils binutils-common binutils-x86-64-linux-gnu build-essential...
    Suggested packages:
      binutils-doc cpp-doc gcc-11-locales...
    Recommended packages:
      libnss-nis libnss-nisplus
    The following NEW packages will be installed:
      binutils binutils-common binutils-x86-64-linux-gnu build-essential...
    The following packages will be upgraded:
      libc6 libpython3.10 libpython3.10-minimal libpython3.10-stdlib python3.10 python3.10-minimal
    6 upgraded, 103 newly installed, 0 to remove and 14 not upgraded.
    Need to get 99.4 MB of archives.
    After this operation, 318 MB of additional disk space will be used.
    Do you want to continue? [Y/n] y
    ...
    Get:69 http://archive.ubuntu.com/ubuntu jammy/main amd64 fontconfig-config all 2.13.1-4.2ubuntu5 [29.1 kB]
    Get:70 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 gnupg-l10n all 2.2.27-3ubuntu2.1 [54.4 kB]
    Get:71 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 gnupg-utils amd64 2.2.27-3ubuntu2.1 [308 kB]
    Get:72 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 gpg amd64 2.2.27-3ubuntu2.1 [519 kB]
    Get:73 http://archive.ubuntu.com/ubuntu jammy/main amd64 pinentry-curses amd64 1.1.1-1build2 [34.4 kB]
    Get:74 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 gpg-agent amd64 2.2.27-3ubuntu2.1 [209 kB]
    ...
    Fetched 99.4 MB in 46s (2182 kB/s)
    Extracting templates from packages: 100%
    Preconfiguring packages ...
    (Reading database ... 9572 files and directories currently installed.)
    Preparing to unpack .../libc6_2.35-0ubuntu3.8_amd64.deb ...
    Unpacking libc6:amd64 (2.35-0ubuntu3.8) over (2.35-0ubuntu3.7) ...
    ...
    Setting up libalgorithm-diff-xs-perl (0.04-6build3) ...
    Setting up libalgorithm-merge-perl (0.08-3) ...
    Setting up libpython3-dev:amd64 (3.10.6-1~22.04) ...
    Setting up python3-dev (3.10.6-1~22.04) ...
    Processing triggers for libc-bin (2.35-0ubuntu3.7) ...
    /sbin/ldconfig.real: /opt/mssql/lib/libc++.so.1 is not a symbolic link
    /sbin/ldconfig.real: /opt/mssql/lib/libc++abi.so.1 is not a symbolic link
    root@sql1:/# pip install paho-mqtt
    Collecting paho-mqtt
      Downloading paho_mqtt-2.1.0-py3-none-any.whl (67 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 67.2/67.2 KB 40.3 kB/s eta 0:00:00
    Installing collected packages: paho-mqtt
    Successfully installed paho-mqtt-2.1.0
    root@sql1:/#
Errors? But, yay, paho-mqtt is installed! Install pyodbc also, and then grab the scripts and follow the instructions for the MS SQL driver over here. I copied the Ubuntu scripts, removed sudo as it's not installed, installed curl as it's required and then kicked it off.
    if ! [[ "16.04 18.04 20.04 22.04" == *"$(lsb_release -rs)"* ]]; then
        echo "Ubuntu $(lsb_release -rs) is not currently supported."
        exit
    fi

    curl https://packages.microsoft.com/keys/microsoft.asc | tee /etc/apt/trusted.gpg.d/microsoft.asc
    curl https://packages.microsoft.com/config/ubuntu/$(lsb_release -rs)/prod.list | tee /etc/apt/sources.list.d/mssql-release.list

    apt-get update
    ACCEPT_EULA=Y apt-get install -y msodbcsql17

    # optional: for bcp and sqlcmd
    ACCEPT_EULA=Y apt-get install -y mssql-tools
    echo 'export PATH="$PATH:/opt/mssql-tools/bin"' >> ~/.bashrc
    source ~/.bashrc

    # optional: for unixODBC development headers
    apt-get install -y unixodbc-dev
    root@sql1:/database_tools# ./inst_ms.sh
      % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                     Dload  Upload   Total   Spent    Left  Speed
    100   983  100   983    0     0    231      0  0:00:04  0:00:04 --:--:--   240
      % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                     Dload  Upload   Total   Spent    Left  Speed
    100    88  100    88    0     0     27      0  0:00:03  0:00:03 --:--:--    27
    Hit:1 https://packages.microsoft.com/ubuntu/22.04/prod jammy InRelease
    Get:2 http://security.ubuntu.com/ubuntu jammy-security InRelease [129 kB]
    Hit:3 http://archive.ubuntu.com/ubuntu jammy InRelease
    Get:4 http://archive.ubuntu.com/ubuntu jammy-updates InRelease [128 kB]
    Hit:5 http://archive.ubuntu.com/ubuntu jammy-backports InRelease
    Fetched 257 kB in 9s (29.6 kB/s)
    Reading package lists... Done
    Reading package lists... Done
    Building dependency tree... Done
    Reading state information... Done
    msodbcsql17 is already the newest version (17.10.6.1-1).
    msodbcsql17 set to manually installed.
    0 upgraded, 0 newly installed, 0 to remove and 14 not upgraded.
    Reading package lists... Done
    Building dependency tree... Done
    Reading state information... Done
    mssql-tools is already the newest version (17.10.1.1-1).
    0 upgraded, 0 newly installed, 0 to remove and 14 not upgraded.
    Reading package lists... Done
    Building dependency tree... Done
    Reading state information... Done
    The following additional packages will be installed:
      libodbccr2
    The following NEW packages will be installed:
      libodbccr2 unixodbc-dev
    0 upgraded, 2 newly installed, 0 to remove and 14 not upgraded.
    Need to get 264 kB of archives.
    After this operation, 1895 kB of additional disk space will be used.
    Get:1 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 libodbccr2 amd64 2.3.9-5ubuntu0.1 [16.7 kB]
    Get:2 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 unixodbc-dev amd64 2.3.9-5ubuntu0.1 [248 kB]
    Fetched 264 kB in 8s (33.1 kB/s)
    Selecting previously unselected package libodbccr2:amd64.
    (Reading database ... 19952 files and directories currently installed.)
    Preparing to unpack .../libodbccr2_2.3.9-5ubuntu0.1_amd64.deb ...
    Unpacking libodbccr2:amd64 (2.3.9-5ubuntu0.1) ...
    Selecting previously unselected package unixodbc-dev:amd64.
    Preparing to unpack .../unixodbc-dev_2.3.9-5ubuntu0.1_amd64.deb ...
    Unpacking unixodbc-dev:amd64 (2.3.9-5ubuntu0.1) ...
    Setting up libodbccr2:amd64 (2.3.9-5ubuntu0.1) ...
    Setting up unixodbc-dev:amd64 (2.3.9-5ubuntu0.1) ...
    Processing triggers for libc-bin (2.35-0ubuntu3.7) ...
    /sbin/ldconfig.real: /opt/mssql/lib/libc++.so.1 is not a symbolic link
    /sbin/ldconfig.real: /opt/mssql/lib/libc++abi.so.1 is not a symbolic link
    root@sql1:/database_tools#
Before long, the SQL drivers were installed. Now, the meat: we need a Python script to watch MQTT. Using chunks from this tutorial, I came up with the following:
    #!/usr/bin/python3
    import paho.mqtt.client as mqtt
    import pyodbc
    from datetime import datetime

    # MQTT Settings
    brokerAddress = "localhost"
    userName = "garage_door"
    passWord = "password_here"
    subscribeTopic = "garage_door/publish/this_is_a_key"

    # Connect to Database
    driver = "{ODBC Driver 17 for SQL Server}"
    server = "localhost"
    database = "MQTT_DATA"
    username = "mqtt_user"
    password = "password_here"
    connectionString = "DRIVER=" + driver + ";SERVER=" + server + ";DATABASE=" + database + ";UID=" + username + ";PWD=" + password

    conn = pyodbc.connect(connectionString)
    cursor = conn.cursor()

    # The callback for when the client receives a CONNACK response from the server.
    def on_connect(client, userdata, flags, rc):
        if rc == 0:
            print("Connected successfully")
        else:
            print("Connect returned result code: " + str(rc))

    # The callback for when a PUBLISH message is received from the server.
    def on_message(client, userdata, msg):
        topic = msg.topic
        measurementValue = msg.payload.decode("utf-8")
        SaveToDatabase(topic, measurementValue)

    def SaveToDatabase(topic, measurementValue):
        print(topic + " " + measurementValue)

        # Find Date and Time
        now = datetime.now()
        datetimeformat = "%Y-%m-%d %H:%M:%S"
        measurementDateTime = now.strftime(datetimeformat)

        # Insert Data into Database
        query = "INSERT INTO MEASUREMENTDATA (SensorName, MeasurementValue, MeasurementDateTime) VALUES (?,?,?)"
        sensorName = topic
        parameters = sensorName, measurementValue, measurementDateTime
        cursor.execute(query, parameters)
        cursor.commit()

    # Create the MQTT client
    client = mqtt.Client()
    client.on_connect = on_connect
    client.on_message = on_message
    #client.tls_set(tls_version=mqtt.ssl.PROTOCOL_TLS)  # this may cause issues if not set up.
    client.username_pw_set(userName, passWord)
    client.connect(brokerAddress, 1883)
    client.subscribe(subscribeTopic)
    client.loop_forever()
With all the correct drivers installed, I got the following:
    root@sql1:/database_tools# ./mqtt_watcher.py
    Traceback (most recent call last):
      File "/database_tools/./mqtt_watcher.py", line 20, in <module>
        conn = pyodbc.connect(connectionString)
    pyodbc.InterfaceError: ('28000', "[28000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Login failed for user 'mqtt_user'. (18456) (SQLDriverConnect)")
HAHA! Yes. It hates me... and it should... none of those credentials are correct. I went and corrected the auth, the table and the insert schema, and the data was logged!
Green Hills, Nou, Itoigawa – June, 2024
Whilst driving with friends from Osaka to Aizuwakamatsu, we needed a place to split the trip and I started Googling. Toyama was a logical mid-way point, but I didn't want us to have to slum it in a boring business hotel when we just needed a place to sleep. Instead, I wanted to enjoy the summer twilight, rail-side in a country house, and I managed to find just the place: Green Hills in Nou.
This beautiful house is perched directly next to the Nihonkai Hisui Line. This happens to be a private line, but it's also the main trunk up the northern coast of Japan and the fastest route for freight from Hokkaido when trying to get further west than Tokyo. So, whilst you won't see JR limited express trains (thanks to the Hokuriku Shinkansen), you will see the trains of the Echigo-Tokimeki Railway and a parade of freight trains hauled by EF510 locomotives.
The photo above was taken from the left window that you can see in the first image. It was the 'second bedroom' on the second floor of the house and had a beautiful view of the railway! Despite the morning rain, I managed to get up in time for the first 'daylight' freight just after 4am. Disregard the tapping in the video; it's just a leak from the gutter above, beating on the first-floor roof.
The trains thundered by like clockwork... and I really need to emphasise the word 'thundered'. Due to there being a crossover right in front of the tunnel portal next to the house, the katan-katan of the wheelsets was loud enough to actually rattle the foundations of the house.
On the next trip to Japan, I'll probably book this place for a few days and set up a temporary live camera so everyone can enjoy the sights.