I am learning a whole lot more about the constrained environment of the microprocessor. For a start, you don’t have a Linux command line! And, perhaps more insidious, you do not have much memory.
Platforms such as Arduino or Pyboard have very restricted memory spaces, and in essence you ‘sit’ within the Python interpreter and run commands there. In practice it varies: some implement a REPL, others a server that takes FTP or Telnet commands, but it is very different from working even on a small single-board computer such as a Raspberry Pi or Odroid, which has full Linux, a command-line shell such as Bash, and even a desktop and browser.
I was getting along well with my project displaying weather patterns on the e-ink device, and started retrieving more and more detail on the controller; however, after a few iterations I got memory errors. Thinking this was just a matter of garbage collection, I forced a collection and got more iterations in – but it still eventually crashed. It seems that the JSON parsers and suchlike are leaking memory, and no amount of garbage collection will help – and the direct calls to the weather-service URLs surely aren’t helping!
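On MicroPython the first resort for this is the `gc` module (also present in CPython, so this sketch runs anywhere); the fetch function here is just a placeholder for the real work:

```python
import gc

def fetch_and_display():
    ...  # placeholder for the HTTP fetch, JSON parse and e-ink update

for _ in range(3):
    fetch_and_display()
    gc.collect()                 # reclaim garbage between iterations
    if hasattr(gc, "mem_free"):  # MicroPython-only free-heap query
        print(gc.mem_free())
```

As the post notes, this only delays the crash if something below the Python level is leaking – collection can only reclaim what the interpreter itself allocated.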
I’ve now split the code into two components: a server component most likely running under NodeRED on one of my SBCs, and the display component which will simply take the weather readings from the server and display them, concentrating on what it does best.
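A minimal sketch of the display side under this split – the URL and field names below are assumptions, not the actual NodeRED flow, and on the device the fetch itself would use MicroPython's `urequests`:

```python
import json

def parse_readings(payload):
    """Decode the small pre-digested JSON document the server prepares,
    e.g. '{"temp": 12.5, "rain": true}' (field names are made up here)."""
    d = json.loads(payload)
    return d["temp"], d["rain"]

# On the device, roughly:
#   r = urequests.get("http://nodered.local/weather")  # hypothetical URL
#   temp, rain = parse_readings(r.text)
```

The point of the design is that the microcontroller never touches the large, leak-prone weather-service responses at all.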
I may also gain advantages from this method, as the display component can run on smaller embedded devices such as the ESP8266 which do low-power much better than fuller devices such as the WiPy2. Who knows, maybe months of display without having to replace the batteries?
I started musing about the complexity of modern computer systems after talking with a colleague about recent computer crashes. Those not familiar with modern computer systems expect them to work better than they do and seem to take delight in handing out blame where none lies. Having worked for decades in computers both large and small I am astounded by a couple of things: that complex computer systems actually work pretty well, and that they don’t crash more.
Humans seem to naturally enjoy reducing things to black and white, or two or three choices. Listen to the evolutionary biologists and they’d claim all sorts of things to prove this, such as living in caves and having only a few foods to eat or whatnot. I don’t entirely believe them either – they sound too much like looking for the hypothesis in the evidence, rather than the other way around.
In my professional life I have been through perhaps 3 or 4 major incidents where massive numbers of people could have been impacted and each time a very focused and dedicated technical team have averted disaster. These things are not easy and take real hard work. I’d rather rely on people who have experienced real complex issues than those armchair generals that never saw a battle.
Case in point: you’d perhaps expect that major companies know exactly what systems they have installed, how many applications run on them and where they are? You’d be wrong, as time and time again I hear stories or know by personal experience that they do not know entirely where things are. This varies from the incidental (we didn’t know that we were running that much software) through to the monumental: a major corporation thinking it had about 8,000 systems and finding that they had … 14,000. That is quite a difference from what they expected.
Putting aside our human tendency to over-simplify, what happens as systems get even more complex? Can we get computers to manage other computers, can we adopt methods and architectures which are much more emergent rather than the procedural and directive that we currently use? What happens when the system becomes more than the sum of its parts and can’t be governed except by itself?
Interesting discussion on an internet forum about job redundancies and how heartless large corporations are in dealing with employees. It does make me wonder who is wrong here: the employees’ expectation of a ‘job for life’, or the fact that capitalism rides on the tides of the market, a fundamental aspect of which is that supply answers demand. Redundancies are built into the system.
Of course there are reams of tomes written to describe the good and bad of economics, and I don’t think command-and-control economies are any better, nor those built on class systems or corruption. Drug- or oil-economies aren’t much better (unless you are Norway and put it into a long-term state fund) and I don’t admire those economies which allow some members to have wealth because of their family connections. No, I am not a monarchist.
What I do admire is a level playing field where all are given the same start to life, the same opportunities, and the same incentives to develop their full potential. Offering to let them become part of the ruling elite, for example, is a straw-man argument: privileging the few at the expense of the many simply extends the injustice into the next generation.
Egalitarianism has benefits, both for those at the top of society and for those at the bottom. My belief is that societal expectations have to change into a more fluid model that sees expert plumbers at the same level as philosophers (or bankers, lawyers, doctors) and rewards effort and dedication rather than connection and background. Organisational models need to change as well, so that companies no longer have the protection of ‘personhood’ under the law, and plain old hierarchical organisations adapt to a more fluid view of entering and leaving the organisation.
I’m reading “Reinventing Organisations” by Frederic Laloux and taking an interest in the concepts of Holacracy – although I’d have to admit that a lot of things get hyped as the new silver bullet and touted around the management training circles simply as a way of earning money for those management consultants. I’d wait for a couple of centuries of real-world testing before suggesting that any of these are more than just fads.
I’ve recently been trying to get the Plex media centre running on an ARM machine. I had previously experimented with Kodi as a media server, and while I could get it running and serving to a variety of devices, including a Raspberry Pi 2 machine for playback, making it all work for my family seemed too Heath Robinson-esque, with computers stuck to the walls of my daughter’s bedroom and long lectures on complex controls.
It may be helpful at this point to understand some of the main differences between the two software alternatives for home theatre PCs: Kodi and Plex. While there are others, these are the common choices. They had a similar beginning and may share some code, but they operate in different ways: Kodi is a client, while Plex runs on a server. With both, the actual recordings can be held anywhere, and neither will record TV without extra plugins.
A table taken from HTPCBeginner.com may help:
| | Kodi | Plex |
|---|---|---|
| Database of movie information | On the client | On the server |
| Client software | An app or application running on your client device – computer, phone | |
| Transcoding – playing the recording onto your viewing device | Handled by client | Needs a beefy server |
| What client hardware can be used | Lots and lots | Very flexible, and necessary! |
| Pricing | | Free and subscription models |
So the choice comes down to the open-source, configurable Kodi or the simple-to-install, standardised Plex. More than their development models, I appreciate the difference in their approach: one works out of the box, the other takes some configuring but is very flexible.
Having got Plex running under Exagear on my quad-core Odroid C2 server, I then ran into the problem of the CPU power required for ‘transcoding’. Transcoding is essentially converting the recorded film into a format that can be displayed on the client device, whether a mobile phone, web browser, or TV. The formats in which films are recorded are a whole subject in themselves, and changing them from one format to another is computationally expensive. My little single-board computer could not cope.
Even the old Intel Atom system I used to run could not cope – apparently I need a PassMark score of over 2,000, and although my quad-core server could perhaps do the work, the Plex software sometimes decided that it couldn’t. So I’ve started looking for a low-powered, small and silent system with enough processing power to happily transcode all my movies. Apparently some NAS (Network-Attached Storage) boxes can do it; however, at their price point they are a significant purchase and not something I want to spend on entertainment. I’ve looked at AliExpress and there are some very good i5 mini-PCs available, so I may go that route.
I’m interested in converting my house sensor network into one running on LoRa.
The background to this is the sad demise of the Nottingham company WirelessThings from whom I had purchased a number of house sensors. These were small devices transmitting over 868MHz with a variety of ‘personalities’ including temperature, flood, magnetic, on/off, light and so on. I had them scattered around the home, including a water detector on my back fence where the seasonal flooding from the local river first enters my property.
The head end was a simple USB stick plugged into one of my SBCs running NodeRED. This collected incoming radio signals using a text protocol called ‘LLAP’ and converted them to MQTT, sending them on to a variety of end points such as alerting my mobile phone when floods were coming in. As the devices went into a configurable sleep mode, the button batteries lasted years without replacement. I even had a separate NodeRED flow that handled battery level to alert to replace batteries. They were useful in a number of ways:
- knowledge of battery state
- ability to ‘talk back’ to a sensor
- transmission over relatively large distances
- simple, configurable and multiple sensors using a small form factor enclosure
- different sensors encoded values into a similar-length text string (12 bytes)
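A hedged sketch of decoding one of those frames – LLAP messages are 12 characters, and the layout assumed here (‘a’, a two-character device ID, then a dash-padded payload) is from memory rather than the spec:

```python
def parse_llap(frame):
    """Split a 12-character LLAP frame such as 'aXXTMPA23.5-'
    into a device ID and its payload (format assumed)."""
    if len(frame) != 12 or not frame.startswith("a"):
        raise ValueError("not an LLAP frame")
    return frame[1:3], frame[3:].rstrip("-")

parse_llap("aXXTMPA23.5-")   # → ('XX', 'TMPA23.5')
```

In the NodeRED flow the resulting (ID, payload) pair was what got republished over MQTT.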
All that has finished with the closing of that company, so I have been toying with using a LoRa gateway from someone like Pycom, or purchasing a full LoRa gateway and connecting this to The Things Network and thence back to my servers. Cost is playing a significant part as I can’t really afford three- or four-figure sums for my electronics hobby. The Pycom LoPy running MicroPython looks like a good alternative (even if it cannot act as a true LoRa gateway due to lack of multiple channels) so I may wait for that to stabilise and then work from there.
I think we are going to hear an oft-repeated phrase in the next five years. That phrase will be “because we left the EU…”. I’ve just encountered it at a soft-furnishings store where the price increase on Egyptian towels was explained by the salesman who said “because we’ve left the EU the prices of cotton have risen, so we had to put up the price of towels”.
- But the UK hasn’t yet left the EU,
- suppliers in Europe cannot charge the UK different prices than other EU countries before the UK leaves the Common Market,
- … and Egypt, where the ‘Egyptian cotton towels’ come from, was never part of the EU anyway.
What’s going on here? It can’t be only the exchange rate as cotton and other commodities are purchased on a futures market so as to avoid vagaries in price.
I’ve a feeling that this will be just the start of this saga. Retailers by their very nature often have very thin margins and are at the mercy of the whole supply chain. Disintermediation in the form of eBay or Amazon, who sell direct to the consumer, has cut their revenues, and the Common Market meant that I could buy parts for my aquarium filters at one-third to a half of local UK prices. And that is while we were in the EU! With the upheaval in politics represented by the EU vote, a whole lot of things will now be blamed on it rather than the more prosaic explanation that sellers make profit whenever they can.
I had an epiphany after a few weeks of working on this weather calendar. It seems that the routine which works on CPython implementations (such as those running on Linux or Mac) and drives the ESP01 ESP8266 chip well does not work so well under the embedded MicroPython.
I’d used a library which worked really well when operating the display remotely with the ESP01 driving it. It formatted the commands correctly and got the display to respond. However, running this on the various MicroPython devices such as the Adafruit Feather Huzzah or the WiPy, the display refused to respond, and reading the Rx pin showed it sending back “Error:20” on each command.
The epiphany occurred when I saw the number of characters written to the output Tx pin:
A5000900CC33C33C - string
b'\xa5\x00\t\x00\xcc3\xc3<' - unhexlify of string
¥ Ì3Ã<¬ - string through H2B routine
b'c2a5000900c38c33c3833cc2ac' - hexlify dump of that
13 - size of output written to the Tx pin
That final figure should have been 9, the packed byte length of the command plus a parity byte. For some reason the routine in that library called H2B was putting an extraneous ‘c2’ at the start, plus some extra insertions in the middle and at the end.
I investigated, and it seems that the binascii module provides some of the same functionality, in particular the hexlify/unhexlify calls which pack a hex representation into bytes – just what I want. Luckily those two calls are implemented in MicroPython, although others are not. If I can iterate the string and build a parity byte okay, then I think I can build the correct commands needed by the display and drive it from a local Huzzah or WiPy device!
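As it happens, the trailing ‘¬’ (0xAC) in the H2B output above is exactly the XOR of the eight command bytes, so a plain XOR checksum appears to be the parity scheme; assuming that, the command build reduces to:

```python
from binascii import unhexlify

def build_command(hex_str):
    """Pack a hex command like 'A5000900CC33C33C' into bytes and
    append an XOR-of-all-bytes parity byte (scheme inferred from the
    library's output, not from any display documentation)."""
    data = unhexlify(hex_str)   # b'\xa5\x00\t\x00\xcc3\xc3<' - 8 bytes
    parity = 0
    for b in data:
        parity ^= b
    return data + bytes([parity])

len(build_command("A5000900CC33C33C"))   # → 9, the length the display wants
```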
Still not out of the woods yet: on asking on the MicroPython forum why the leading ‘A5’ was converted into two characters, it was helpfully pointed out that hex A5 is 165 in decimal – beyond the single-byte ASCII range, so it gets encoded as two bytes when held in a string – and that I should use a byte string instead. That worked!
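The forum’s point is easy to reproduce in plain CPython: any code point above 0x7F encodes to more than one UTF-8 byte, which is exactly where the stray ‘c2’ came from:

```python
s = "\xa5"                  # the character '¥', code point 165 (0xA5)
print(s.encode("utf-8"))    # b'\xc2\xa5' - two bytes go down the wire
print(b"\xa5")              # a byte string sends exactly the one byte
```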
The last issue is how to take the string commands, compute their parity bytes, and encode the result as byte strings without any character conversion – then I will have cracked this one.
The Adafruit Feather range is a very nice set of development boards from the NYC company. They are a good form factor (approx. 5x2cm) and stack using appropriate headers. I particularly like the Feather Huzzah and initially went for these in a big way as they fulfil a number of criteria:
- they can run MicroPython,
- the processor boards are wifi-enabled,
- and added benefits like lots of additional add-ons (RTC, OLED, 7-segment displays) plus great tutorials on their web site.
Initial and ongoing experience proved that they were very easy to work with, and they worked as expected. However, the ESP8266 wifi-enabled board does not go into a really low-power mode, consuming around 70 mA normally, and its deepsleep mode requires manual intervention. That most likely means I can’t use it longer term as the driver for the display, but it will prove useful as a learning exercise on the way down the power ladder.
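For reference, the deep-sleep recipe from the MicroPython ESP8266 documentation looks like this – the “manual intervention” is wiring, since GPIO16 has to be connected to RST so the RTC alarm can reset the chip (MicroPython-only; this will not run under desktop Python):

```python
import machine

rtc = machine.RTC()
rtc.irq(trigger=rtc.ALARM0, wake=machine.DEEPSLEEP)
rtc.alarm(rtc.ALARM0, 60000)   # wake after 60 seconds
machine.deepsleep()            # current drops to tens of microamps
```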
Using Arduino IDE
I initially started by loading MicroPython onto the board using the esptool flasher, but ran into an issue: I could not seem to find the correct pinout for the Tx/Rx. Using the pins marked on the board interferes with the USB–serial controller and you see spurious data on the link. So I backed off and re-flashed it via the Arduino IDE with a neat little bit of example clock code, which works well – proving that the Tx pin works at least, and that the battery-powered Feather Huzzah can indeed drive the e-ink display.
However, using MicroPython proves more difficult, as I cannot seem to find the correct Tx pin for UART 1. Having flashed the 1.8.4 code level onto it using the instructions, I initially tested UART(0), but as this is connected to the USB–serial chip you get all the USB traffic on those pins, so I looked to the write-only UART(1). I could not find which physical pin this was attached to, even after trying all the pins one by one.
The ESP8266 is an incredible little module produced by Espressif. It combines a little processor with a wifi chip that can act either as a client on an existing wifi network or as an access point creating its own network. At around $2 it is as “cheap as chips”!
I saw an excellent example of using a variant called the ESP-01 to run a remote e-ink display, along with the Python code to run it. Whilst it was very attractive and had lots of functionality, it did not meet my earlier requirement of being low-powered, as the ESP8266 chips still draw about 70 mA of current to run. So I could not run a battery-powered system on this for long, although I was surprised at how easy it was to get the demo program working.
I used a few websites to get started, preferring to connect to the ESP-01 using a CP2102 UART–USB module which I already had (and could set to either 5V or 3.3V – the ESP8266 uses only 3.3V). Once connected using the appropriate bits of wire – remembering to cross Tx–Rx and Rx–Tx between the boards – I then had to set the ESP-01 into station mode, connect it to my local wifi, and start the server. Commands were roughly:
- AT+GMR, to get the firmware version
… but see the sites referenced to better understand these modem commands and how the embedded server works. I like this chip and I’m going to order a few more just to have if I later want to wifi-enable more projects.
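Assuming the standard Espressif AT firmware of that era, the full sequence would have been along these lines (SSID, password and port are placeholders):

```
AT+GMR                         get the firmware version
AT+CWMODE=1                    station mode (client of an existing network)
AT+CWJAP="myssid","mypass"     join the local wifi
AT+CIFSR                       report the IP address obtained
AT+CIPMUX=1                    allow multiple connections
AT+CIPSERVER=1,333             start a TCP server on port 333
```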
I earn a fairly decent salary. Checking http://globalrichlist.com shows that I am in the top 1% richest people in the world by income. So things should be okay, right?
Well, no. Each year I struggled to make ends meet. I budgeted rigidly and allocated all my spending into a number of buckets. I had a spreadsheet with lots of columns and measured my electricity down to the kilowatt-hour. I had multiple bank accounts for different categories and put money into each of them after it arrived in my main account. Yet each year I seemed to be worse off.
Where did all the money go? I don’t smoke, drink heavily, nor gamble. Reducing subscriptions to only two magazines and a programme of reduction in insurance costs helped a little. Getting rid of fixed telephones and using VOIP through my broadband was great. Curtailing long holidays overseas helped but even without taking any holidays away from home for five years it did not stop the slide. If the motto of thrift is ‘to live below your means’ I obviously wasn’t getting it right.
Until I started using the excellent YNAB program I didn’t know how much I spent on different categories – for example, I now know that I spend around £40 per month on electronics. That may sound high, but it includes single-board computers like the Raspberry Pi and controllable lighting which reduces my electricity bill. As this is almost my only hobby apart from reading, I don’t think it is so bad. But until I recorded everything I spent over a year, I had no idea it was so much! What YNAB has done is give me a way to allocate every penny I earn to a budget category and then record each and every purchase against one of those categories throughout the year. It wasn’t even painful, as I used the phone app each time I spent money.
Actually, maybe I am a whole lot better off than most people, and should be thankful for what I do have – my salary is better than the average UK salary of around £25,000, and many other people live in more straitened circumstances. While I do support four adults on that income, I also get to spend on some things that bring me pleasure, and enjoy watching three other people live life to the fullness of their ability.