Keeping an Overview

By now I have quite a few Raspberries in use. They control things (radiators, with openHAB) and measure things (indoors with openHAB, outdoors on my weather stations).

And by now they also host the occasional website. A while ago I built a small dashboard with Grafana. It didn't hurt at all.

This way we can easily tell whether an apartment is already warm enough for the guests, or spot faults in the heating system. Really little effort (if you don't count my learning curve). And indeed: on average it is 3-5°C warmer in Zwingenberg than in Gersfeld!

Kudos to Raspberry Pi

A couple of years ago I started to experiment with the Raspi. What can you expect from a computer that costs less than €50, I thought. But as time went by, I started to use them more and more. Now I have had 4 of them (V3) in production for quite some time, mostly to control radiators and other smart home components through openHAB, an open-source control center software for smart homes.

Today I was checking on one of them, and while logging in, its welcome message filled me with joy:

Last login: Thu Oct 22 07:38:33 2020 from 88.152.4.38
Mi 9. Dez 13:02:39 CET 2020 @ xxxxxxx
Uptime: 13:02:39 up 634 days, 23:17, 1 user, load average: 0,00, 0,00, 0,00

For almost 2 years the little guy has been up and running, doing his work, enduring internet outages and whatnot. I had not expected such a great performance. I hardly dare to upgrade him to the new OS version. Not because I don't trust the update process (I have already upgraded most of them to Buster), but because I want him to keep presenting me such pleasant stats!

PS: Kudos also to the folks from openHAB. While I find some functions quite cumbersome (not a Java friend here), I'm really happy with the RESTful API that they provide. It has allowed me to develop some very helpful automation functions on top of openHAB.

Keep on truckin’

Paying the Bill

When we started to get serious about our holiday apartment business in 2008, I also got serious about providing IT support for the business.

At the time, it was very important to have your own website; I think about 30-40% of the bookings came from that source. With the little experience I had back then, I chose Drupal to build the site. Note: this was before folks were using iPads and smartphones to check for vacation homes. So we ended up with a page that to this day sucks on small screens.

The next project was to create an internal system to keep track of our bookings. So I set off to write a small PHP system to create and track visitors, bookings, invoices and so on. What I had envisioned as a very small app turned into a much bigger endeavour once I noticed that it wouldn't be just me using it. I wanted to add a nice UI and chose a very cool and hyped framework called Angular.js. The initial booking system went live on my provider's servers in 2014, the Angular.js version about a year later. Angular.js is now in long-term support. And starting with the next version (2.0), Angular.js was renamed to Angular and underwent a major redesign and paradigm shift. Needless to say, it more or less requires you to learn a new language called TypeScript.

After all that, I started to enjoy hacking away again. Looking at the cost structure of our business, one of the interesting places where we could lower cost and do some good for the globe appeared to be our heating (aka oil) cost. Around 2016 I had started playing around with an open-source project called openHAB, connecting to closed solutions to control heating radiators. I implemented a system on top of openHAB (or actually on top of openHAB's REST interface) to connect the booking system and the heating control system. That was and is fun! And recently I started using Grafana to visualise the performance of the system, just to find out that the closed solution (I'm using the MAX! system) is not up to par. So I'm learning a lot about the 'hidden features' of MAX! and trying to adapt my control center software, running on Raspberry Pis at home and in the holiday apartments.
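The glue between the two systems boils down to a few HTTP calls against openHAB's REST API. Here is a minimal sketch of the idea — in Python rather than my PHP, and with the hostname and item names invented for illustration:

```python
import urllib.request

# Hostname and item names below are made up for illustration.
OPENHAB = "http://raspi.local:8080"

def item_url(base, item):
    """Build the REST URL for an openHAB item."""
    return f"{base}/rest/items/{item}"

def send_command(base, item, command):
    """POST a plain-text command (e.g. a target temperature) to an item."""
    req = urllib.request.Request(
        item_url(base, item),
        data=command.encode("utf-8"),
        headers={"Content-Type": "text/plain"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# When a booking starts, the booking system could pre-heat the apartment:
# send_command(OPENHAB, "Apartment1_Radiator_Setpoint", "21.0")
```

Reading an item's state works the same way, with a GET on `/rest/items/{item}/state`.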

For the past years I was still fully employed. In June 2020 I changed from full employment to working a maximum of 2 days a week on consulting contracts. This allowed me to sit back and look at the technical debt that I had accrued.

What I learned is:

  • Don’t create a website if you don’t have to
  • If you really need one, think twice whether you need a CMS or whether a static page suffices
  • For each and every piece of software that you develop: Never forget the cost of ownership.
  • Write tests and documentation. You will not understand what you did 6 years ago without it!
  • Front-end technology changes far faster than the backend: Separate the two systems as much as possible (e.g., via a RESTful interface)
  • It sucks to use IoT systems (e.g., smart home hardware/software) that are a) closed and b) developed with a focus on the hardware. In addition, the cost of the hardware is still very high and the ROI is not coming any time soon. Rather invest in bolder solutions (e.g., heat-pump-based solutions, solar energy)
  • It was and is a lot of fun to play!

So I’m going to invest some time into

  • Choosing a different technology for re-writing our public website (static page, goHugo?)
  • Learning Typescript (actually started a course @ Udemy)
  • Choosing the right front-end technology for the internal app (e.g., React, Angular)
  • Making plans for major investments into saving a) the planet and b) heating cost

I’ll keep you posted!

Playing with Data

In the last couple of months I've started to experiment with data. I won't call it big data, because my datasets are rather small: visitors in our holiday apartments, or feedback from EclipseCon Europe attendees. But like a real guy I didn't want to make do with MySQL, I wanted to play with the new toys. So I installed MongoDB.

Anyway, it turned out to be more difficult than I thought. Cleaning up the data and getting it into (the same) shape was harder than expected. But help was just around the corner, coming from Udacity: a course on Data Wrangling, just what I needed. It was the right course at the right time. And forgive me if I don't bore you with all the details that I had to clean up.
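To give an idea of the kind of wrangling involved, here is a hypothetical sketch in Python — the field names and date formats are invented for illustration, not my actual schema:

```python
from datetime import datetime

def normalize_date(value):
    """Dates arrive in several formats; normalize to ISO 8601."""
    for fmt in ("%d.%m.%Y", "%Y-%m-%d", "%d/%m/%Y"):
        try:
            return datetime.strptime(value, fmt).date().isoformat()
        except ValueError:
            pass
    return None  # leave unparseable dates for manual review

def clean_booking(raw):
    """Bring one raw booking record into a consistent shape before import."""
    return {
        "name": raw.get("name", "").strip().title(),
        "arrival": normalize_date(raw.get("arrival", "")),
        "guests": int(raw.get("guests") or 0),
    }

print(clean_booking({"name": " smith ", "arrival": "24.12.2013", "guests": "3"}))
# → {'name': 'Smith', 'arrival': '2013-12-24', 'guests': 3}
```

Only once every record comes out in the same shape does loading them into MongoDB (or querying them later) become painless.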

I’m certainly not a big data expert yet. But I’m starting to understand how to approach my problems and what I can do with my information. And visualizing that information is even more fun. Now you can guess what that heatmap shows 🙂

Heatmap Example (based on Google Maps)

 

Screen Scraping

After my last post I was actually contacted by 2 people asking for more current information on the website that I had built. In particular, they were interested in the conditions of the winter sports facilities that we have in the region (ski-lifts, cross-country trails).

I looked around, and the only information available was on the web sites of the facility operators. No central place where all the data was collected and made available. Since I had never done screen scraping before, I wasn’t really sure what to do.

Reading up on Stack Overflow and other resources, I learned that I had to read an HTML page, turn it into a DOM object and find the right places with the right information for the facilities (closed, open, good conditions, red.gif, green.gif). Looking around, I found a nice helper library that served me very well for my first version: for every webpage to get data from, I wrote a little PHP script to capture the data.
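The approach can be sketched like this — a Python version using the standard library's HTML parser, with invented markup standing in for the operators' pages:

```python
from html.parser import HTMLParser

class StatusParser(HTMLParser):
    """Collect facility statuses by looking at the status images."""
    def __init__(self):
        super().__init__()
        self.statuses = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        src = dict(attrs).get("src") or ""
        if "green.gif" in src:
            self.statuses.append("open")
        elif "red.gif" in src:
            self.statuses.append("closed")

def scrape_statuses(html):
    parser = StatusParser()
    parser.feed(html)
    return parser.statuses

# Invented sample markup; each real site needed its own tweaks.
sample = ('<tr><td>Lift 1</td><td><img src="/img/green.gif"></td></tr>'
          '<tr><td>Trail 2</td><td><img src="/img/red.gif"></td></tr>')
print(scrape_statuses(sample))  # → ['open', 'closed']
```

The fragile part is of course the assumption about the markup: whenever an operator redesigns their page, the scraper needs adjusting.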

This worked well for the first facility, whose website was quite responsive. The second one caused more trouble with regard to response times: now I had a 6-second wait before my page displayed. That wasn't really acceptable, because I still had 2 more places to scrape.


So I took the Saturday afternoon to make it work asynchronously. It turned out to be quite easy: I continued to use my PHP scripts, but converted them into functions that could be called with AJAX calls, returning JSON data. From there it took only a couple more minutes, and I was finished. Displaying the site itself is really fast again, and since the scraped information doesn’t show up in the visible part of the browser things can take a little longer. But even scrolling down right away is fun: I enjoy watching the data show up!
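Each converted function now simply returns a small JSON document for the AJAX call to consume. A sketch of such a payload — in Python here, with the field names being illustrative, not the original ones:

```python
import json

def facility_response(name, status, scraped_at):
    """Wrap one scraped facility status as JSON for the AJAX caller."""
    return json.dumps({
        "facility": name,
        "status": status,        # "open" / "closed"
        "scrapedAt": scraped_at  # when the source page was fetched
    })

payload = facility_response("Lift 1", "open", "2013-01-12T14:30")
# The browser-side callback parses this and fills in the placeholder element.
print(json.loads(payload)["status"])  # → open
```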

Ch’ti

So tonight I was heading to northern France, to present at the Java User Group in Lille. My idea was to fly into Brussels and take a car from there to meet the folks around 18:00 at the meeting location of the Ch’ti JUG to talk about Eclipse and such.

Now it turns out that this was a really bad idea. The plane I took from Berlin to Brussels started late. The excuse the pilot gave was that his cabin crew had miscounted the number of people on the plane. So they unloaded some luggage; then they found out that the people were actually on the plane; then they re-loaded the luggage.

All this cost some time, so we departed about 45 minutes after the planned time. Oh well, I thought, good that I had planned for some extra time.

It turns out that the traffic jams around Brussels were not in my calculation. They ate up all the buffer that I had planned. But there was a chance! My little TomTom navigation app on the iPhone was telling me that I would be only 3 minutes late. Little did it know!

Just 20 kilometers before Lille, my Ford rental car gave up. “No comment,” it said.

So all that’s left is to apologize to the folks in Lille. If they still want me, I’ll be back!

Visiting Ancient Sites

Last week, I had to go to Italy for an Eclipse meeting in Naples and then another one in Florence. Departure to Rome on the Sunday and the May holiday on the Tuesday gave us a chance to visit a couple of places before, between and after the business meetings.

 

It started off on the Monday with a visit to the Forum Romanum. Having learned Latin for many years in school, the place feels sort of familiar, and it’s always fun to visit. And not only that: I actually like to see ancient things a lot more than the religious places, which are usually overloaded with the symbolism of a belief I’m not really keen on. That’s actually true for the pagan religions of ancient times as well, but there I can ignore it.

Next was the Colosseum, and this time we actually decided to stand in line and pay the €7 fee to get in. It took us about an hour to walk around, and that was time well spent! Saw a lot, learned a lot. The Colosseum has an on-site walk that explains the facility, the different building stages, and a day in the life of a Roman attending the games.

It is pretty amazing to wander around this place and imagine that down in the arena people were fighting for their lives while up in the seats families were having their meals warmed up on open fires, playing with their children and doing beauty maintenance. That, at least, is what can be deduced from the items found in the sewers.

Next stop was Naples, where we had May 1st to visit both Pompeii and Herculaneum. Both were destroyed in the eruption of Mount Vesuvius in AD 79. While I had visited Pompeii before, it was very interesting to visit Herculaneum, a much smaller site that is also less frequented by tourists. And you actually have a chance to stroll through the attached modern town and have a normal coffee or beer 🙂

Unlike Pompeii, Herculaneum provides a good look at the structures and buildings, as they were not destroyed as badly as in Pompeii. The picture below shows a Roman fast-food restaurant.

Again on this part of the trip my Latin lessons came back, and I’m looking forward to reading the letters that Pliny the younger sent to his friends, describing the events in AD 79.

As a side note: We stayed in an old and quiet hotel close to Pozzuoli, called Delle Therme. Completely outdated, but it has a lot of charm if you can live with ancient beds :-) And best of all: we ran into a photo shoot where we got the actress to pose for us in her 60s outfit.

Next stop was the Eclipse Day Florence. The event was very well organized, and the line-up of speakers was great. But I guess that’s easier when you have a location like Florence to offer.

Before this post gets too long: Florence is great. I will go back and take a couple of pictures there!

My traveling companion Mike visited Italy for the first time ever. When we parted in Rome, he told me that he was really impressed by Italy.

And guess what: the food was great in all the places we went, so my scales were the only ones that didn’t appreciate Italy.

Open Source Think Tank (Thursday)

Day One turns out to be very interesting, despite my being pretty tired from jet lag.

Earlier today there was a panel discussion on how communities should be managed and treated, with a lot of insight from very experienced community managers; then an introduction to the GENIVI project and initial workshop work. While most of the questions we are supposed to work on have obvious and simple answers (“YES”, “NO”), developing the reasoning in the group reveals a wide range of experience and opinion.

The keynote of the day came from Chris Vein, CTO for Innovation in the Executive Office of the President. The most interesting part was on Open Government: what the different departments are doing to innovate the way they serve (and want to serve in the future) the individual citizens of the country. He gave a couple of examples, from NASA to the Food and Drug Administration, of how open source empowers government agencies. I hope that the talk will be available for public consumption soon. It will certainly give other organizations an idea of how far OSS has already spread throughout the governments of the world.

Now I’m listening to the next case study, presented by the U.S. Department of Veterans Affairs: an open source project comprising a full-blown system for large hospitals.

Looking forward to more!

This Tor Node Is Causing You Grief

In the last couple of days I experienced strange visitors on our holiday apartment site. They came through the Tor network and were trying to create users on the page. Apparently they understood that it was a Drupal site, because they had the right URL and everything.

I had never heard of Tor before, so I’m quite amazed to see visitors from the dark side of the internet. On the other hand, it’s a pain in the butt to delete the 30+ users that they create on a normal day. Does anybody out there have experience with these types of attacks?