WordPress and Google Fonts

Google Fonts are beautiful. And so easy to embed. By now there are roughly 1,500 typefaces that Google makes available license-free and at no cost. Many WordPress themes happily load them from Google's servers. I had never given that any thought, until a friendly gentleman recently pointed out to me that loading the fonts from Google's servers passes data about the site visitor (the IP address) on to Google.

Step 1: Identifying the fonts – Using the browser's developer tools, the fonts can be found quickly under network activity. The procedure is described in great detail, for example, here.
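
If you prefer the command line, the check can also be scripted. This is just a minimal Python sketch (the URL is a placeholder) that scans a page's HTML for references to Google's font servers:

```python
import re
import requests

# Placeholder URL: replace with the page you want to check.
URL = "https://example.com"

html = requests.get(URL, timeout=10).text

# Typical Google Fonts hosts found in stylesheet links and preconnect hints.
pattern = r'''https://fonts\.(?:googleapis|gstatic)\.com[^"'\s>]*'''
hits = sorted(set(re.findall(pattern, html)))

if hits:
    print("Google Fonts references found:")
    for ref in hits:
        print("  " + ref)
else:
    print("No Google Fonts references found in the HTML.")
```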

Step 2: Hosting the fonts locally

The ambitious solution is to save the fonts manually and change the CSS files in the WordPress installation. I can't really recommend this: you should already know your way around the code quite well.

The easy (and my preferred) solution is a WP plugin called 'Local Google Fonts'. After installing and activating the plugin, the fonts in use are detected. They can then be downloaded to your own server and served from there. More details on the procedure can be found here.

Now the site is a little more GDPR compliant again.


DNS Servers – Faster and More Private

Today I stumbled across an article about different DNS servers. Until now I hadn't given the topic much thought and had always left the default setting, 'Use the DNSv4 servers assigned by the internet provider (recommended)', as the FritzBox calls it. But as I had to learn while reading this excellent article by Kuketz IT-Security, that wasn't such a good idea. According to Kuketz, DNS queries are logged and analyzed by many providers' DNS servers. Not a nice move by our dear providers!

I have now entered the DNS servers of Digitalcourage and dismail.de in the FritzBox. Besides a measurable speed improvement, DNS queries from my home network will hopefully no longer be logged.
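
To convince yourself that the change also pays off in speed, a small comparison can be scripted with the dnspython library. A minimal sketch; note that the resolver address below is only a placeholder, so look up the current IPs published by Digitalcourage and dismail.de yourself:

```python
import time
import dns.resolver  # pip install dnspython

# Placeholder address: replace with the resolver you configured in the FritzBox,
# e.g. the one published by Digitalcourage or dismail.de.
RESOLVERS = {
    "system default": None,
    "privacy resolver": "192.0.2.53",
}

def lookup_ms(hostname, nameserver):
    resolver = dns.resolver.Resolver(configure=(nameserver is None))
    if nameserver is not None:
        resolver.nameservers = [nameserver]
    start = time.perf_counter()
    resolver.resolve(hostname, "A")
    return (time.perf_counter() - start) * 1000

for label, server in RESOLVERS.items():
    print(f"{label}: {lookup_ms('example.org', server):.1f} ms")
```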

Give it a try.

Good Things Take Time

I have finally managed to move our old Drupal-7-based website to something more modern. The requirements were/are:

  • the new website has to be readable from smartphones up to large screens (responsive)
  • our availability calendars have to be visible
  • our photos need to be refreshed and the texts need to be updated
  • the site has to be easy to administer
  • the underlying system should provide simple update mechanisms

What we no longer wanted:

  • Too much secondary information, such as restaurant guides
  • Dependence on third parties for displaying the calendars (for the past few years we embedded calendars from ferienzentrum.de)
  • Hard technology jumps like Drupal v7 -> v8 and the like
  • A bilingual site: in the age of deepl.com that is no longer necessary.

The options were:

  • a static website (e.g., Hugo); I thought about that for a long time
  • a CMS (Drupal, WordPress)
  • a Python / JavaScript based website

In the end, we chose WordPress. There is a wide variety of themes that we could easily adapt to our needs. Pictures are easy to organize and can be presented via galleries and lightboxes. Integrating Google Calendar becomes very simple with a paid plugin. My biggest problems (or comprehension problems) were with the Google Cloud Console. The editors are easy to understand and usable even for non-techies. Most importantly, the WordPress community knows its business and makes it easy for us users to keep the site on the latest versions.

I would have liked to build the site with Python/JS as well. That plan failed because Python isn't available at my hoster. At least that's my excuse. Just like with Hugo, I would have had to do a lot of the CSS and design myself, and that is not exactly my strong suit …

Here are the things I still want to do:

  • add a booking bar: visitors enter arrival and departure dates and get back a list of apartments that are bookable for that period, with current prices, similar to booking.com
  • current weather information that I collect via our weather station (see also https://www.wunderground.com/dashboard/pws/IGERSF1)
  • pictures from our webcams

Link to the new site: https://ferien-in-gersfeld.de

Kudos to Raspberry Pi

A couple of years ago I started experimenting with the Raspi. Well, what can you expect from a computer that costs less than 50€, I thought. But as time went by, I started to use them more and more. Now I have had four of them (v3) in production for quite some time, mostly to control radiators and other smart home components through openHAB, an open-source control center software for smart homes.

Today I was checking on one of them, and while logging in, its welcome message filled me with joy:

Last login: Thu Oct 22 07:38:33 2020 from 88.152.4.38
Mi 9. Dez 13:02:39 CET 2020 @ xxxxxxx
Uptime: 13:02:39 up 634 days, 23:17, 1 user, load average: 0,00, 0,00, 0,00

For almost two years the little guy has been up and running, doing his work, enduring internet outages and whatnot. I had not expected such great performance. I hardly dare to upgrade him to the new OS version. Not because I don't trust the update process (I have upgraded most of them to Buster already), but because I want him to keep presenting me such pleasant stats!

PS: Kudos also to the folks from openHAB. While I find some functions quite cumbersome (not a Java friend here), I'm really happy with the RESTful API they provide. It has allowed me to develop some very helpful automation functions on top of openHAB.
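
To give an idea of how approachable that API is, here is a minimal Python sketch. The host, port and the item name Radiator_Setpoint are assumptions for illustration, not my actual setup:

```python
import requests

OPENHAB = "http://localhost:8080"   # assumed openHAB host and default port
ITEM = "Radiator_Setpoint"          # hypothetical item name

# Read the current state of an item; openHAB returns it as plain text.
state = requests.get(f"{OPENHAB}/rest/items/{ITEM}/state", timeout=5).text
print(f"{ITEM} is currently {state}")

# Send a command to the item: a plain-text body, e.g. a new target temperature.
requests.post(
    f"{OPENHAB}/rest/items/{ITEM}",
    data="19.5",
    headers={"Content-Type": "text/plain"},
    timeout=5,
)
```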

Keep on truckin’

Paying the Bill

When we started to get serious about our holiday apartment business in 2008, I also got serious about providing IT support for the business.

At the time, it was very important to have your own website. I think about 30-40% of the bookings came from that source. With the little experience I had back then, I chose Drupal to build the site. Note: this was before people were using (i)Pads and smartphones to check for vacation homes. So we ended up with a page that, to this day, sucks on small screens.

The next project was to create an internal system to keep track of our bookings. So I set off to write a small PHP system to create and track guests, bookings, invoices and so on. What I had envisioned as a very small app turned into a much bigger endeavour once I noticed that it wouldn't be just me using it. So I wanted to add a nice UI and chose a very cool and hyped framework called Angular.js. The initial booking system went live on my provider's servers in 2014, the Angular.js version about a year later. Angular.js is now in long-term support. And starting with the next version (2.0), Angular.js was renamed to Angular and underwent a major redesign and paradigm shift. Needless to say, it more or less requires you to learn a new language called TypeScript.

After all that, I started to enjoy hacking away again. Looking at the cost structure of our business, one of the interesting places where we could lower costs and do some good for the planet appeared to be our heating (aka oil) costs. Around 2016 I had started playing around with an open-source project called openHAB, connecting to closed solutions to control heating radiators. I implemented a system in openHAB (or actually used openHAB's REST interface) to connect the booking system and the heating control system. That was and is fun! And recently I started using Grafana to visualize the performance of the system, just to find out that the closed solution (I'm using the MAX! system) is not up to par. So I'm learning a lot about the 'hidden features' of MAX! and trying to adapt my control center software, which runs on Raspberry Pis at home and in the holiday apartments.
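
The heart of that coupling fits in a few lines. A hedged Python sketch of the idea, in which the booking lookup and the item name Apartment1_TargetTemp are purely hypothetical stand-ins:

```python
from datetime import date
import requests

OPENHAB = "http://localhost:8080"   # assumed openHAB host
ITEM = "Apartment1_TargetTemp"      # hypothetical thermostat item

def apartment_occupied(day: date) -> bool:
    """Stand-in for a lookup against the booking system."""
    return False  # replace with a real query against your bookings

# Heat comfortably while guests are there, otherwise fall back to a low setpoint.
target = 21.0 if apartment_occupied(date.today()) else 16.0

requests.post(
    f"{OPENHAB}/rest/items/{ITEM}",
    data=str(target),
    headers={"Content-Type": "text/plain"},
    timeout=5,
)
```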

For the past years I had still been fully employed. In June 2020 I switched from full-time employment to working a maximum of two days a week on consulting contracts. This allowed me to sit back and look at the technical debt I had accrued.

What I learned is:

  • Don’t create a website if you don’t have to
  • If you really need one, think twice whether you need a CMS or whether a static page suffices
  • For each and every piece of software that you develop: Never forget the cost of ownership.
  • Write tests and documentation. You will not understand what you did 6 years ago without it!
  • Front-end technology changes far faster than back-end technology: separate the two systems as much as possible (e.g., via a RESTful interface)
  • It sucks to use IoT systems (e.g., smart home hardware/software) that are a) closed and b) developed with a focus on the hardware. In addition, the cost of the hardware is still very high and the ROI is not coming any time soon. Rather invest in bolder solutions (e.g., heat-pump-based systems, solar energy)
  • It was and is a lot of fun to play!

So I’m going to invest some time into

  • Choosing a different technology for re-writing our public website (static page, goHugo?)
  • Learning Typescript (actually started a course @ Udemy)
  • Choosing the right front-end technology for the internal app (e.g., React, Angular)
  • Making plans for major investments into saving a) the planet and b) heating costs

I’ll keep you posted!

Meshing it up

I really can't explain why it took so long. But I finally have my Gersfeld mesh network up and running. What was the problem: we have one location where the provider can only deliver terrible internet (says the provider). The idea had been around for quite a while: couple the bandwidth of the other locations (different contracts, different connection options) and then provide good bandwidth at the underserved location as well.

The hardware had been lying around for weeks (months). Today I finally put it all together. And behold: there was light.

Folks, the days of Freifunk are not over yet. Mesh networks are a huge opportunity for small places like ours. Now we just have to figure out how to connect all the dots!

 

The sharing economy: Uber will stop ignoring rulings

Just came across an article in a German internet news portal. The headline claims that Uber won't ignore official rulings anymore. That is an interesting statement. It makes clear that in the past Uber thought it was above everything and everybody. And it leads to another thought: at least here in Germany they had to stop the bullying and behave. I really appreciate that German cities have started to take a stand and show the Überflieger that the same rules apply to everybody. Uber's Germany manager, Christian Freese, now sounds like the wolf who ate chalk. He even claims that Uber is trying to cooperate with the traditional taxi companies.

UberBlack has been prohibited in Berlin, and UberPop has been prohibited Germany-wide.

And while we're at it: Berlin has taken a stand and is reclaiming rental space from AirBnB and similar companies. But more on the friends-rent-to-friends companies in a different post.


Playing with Data

In the last couple of months I've started to experiment with data. I won't call it big data, because my datasets are rather small: visitors in our holiday apartments, or feedback from EclipseCon Europe attendees. But like a real guy I didn't want to make do with MySQL, I wanted to play with the new toys. So I installed MongoDB.

Anyway, it turned out to be more difficult than I thought. Cleaning up the data and getting it into (the same) shape was harder than expected. But help was just around the corner, coming from Udacity: a course on Data Wrangling, just what I needed. It turned out to be the right course at the right time. And forgive me if I don't bore you with all the details of what I had to clean up.
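
The pattern itself is not complicated. A minimal sketch with pymongo, where the field names and the cleanup rules are made up for illustration:

```python
from pymongo import MongoClient  # pip install pymongo

# Hypothetical raw records, e.g. exported guest data with inconsistent fields.
raw_records = [
    {"name": " Smith, John ", "country": "Germany", "nights": "7"},
    {"name": "Doe, Jane", "country": "DE", "nights": 3},
]

def clean(record):
    """Bring one record into a consistent shape."""
    return {
        "name": record["name"].strip(),
        "country": "DE" if record["country"] in ("DE", "Germany") else record["country"],
        "nights": int(record["nights"]),
    }

client = MongoClient("mongodb://localhost:27017")  # assumed local MongoDB
visitors = client["holiday"]["visitors"]           # hypothetical database/collection
visitors.insert_many([clean(r) for r in raw_records])
```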

I'm certainly not a big data expert yet. But I'm starting to understand how to approach my problems and what I can do with my information. And visualizing that information is even more fun. Now you can guess what that heatmap shows 🙂

Heatmap Example (based on Google Maps)

 

A Ninja is Dead

I've been one of those folks who bought a Ninja Block pretty early after they appeared. This was in the time before I knew much about Arduinos and Raspberry Pis. My block turned out to be fun to play with: I could read humidity and temperature and stream my webcam. Through its nice little REST interface I had a chance to link all this into my small web universe.

Anyway, I liked it so much that I wanted to buy another one for a second location, again with all the sensors you need. It turned out that everything was sold out, and the Ninja folks said they were planning the next big thing – the Ninja Sphere. And even small things like temperature sensors continued to be sold out 🙁

I don't know when the Sphere was first announced, but by now it feels like a year or two ago. Nothing has happened since, at least nothing that I can see. No release dates, just a link to a weird release plan in one of their blog posts.

Too bad. I really liked typing in the ninja URL.

Screen Scraping

After my last post I was actually contacted by 2 people asking for more current information on the website that I had built. In particular, they were interested in the conditions of the winter sports facilities that we have in the region (ski-lifts, cross-country trails).

I looked around, and the only information available was on the web sites of the facility operators. No central place where all the data was collected and made available. Since I had never done screen scraping before, I wasn’t really sure what to do.

Reading up on Stack Overflow and other resources, I learned that I had to fetch an HTML page, turn it into a DOM object and find the right places with the right information for the facilities (closed, open, good conditions, red.gif, green.gif). Looking around, I found a nice helper library that served me very well for my first version: for every webpage I wanted data from, I wrote a little PHP script to capture it.
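
My scripts were written in PHP, but the idea translates directly. A hedged Python sketch using requests and BeautifulSoup, where the URL and the image-based status check are made-up examples of what such a scraper looks for:

```python
import requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

# Placeholder URL: in reality, one small script per facility operator's page.
URL = "https://example.com/lift-status"

html = requests.get(URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Example heuristic: the operators signal the status with little images.
status = "unknown"
for img in soup.find_all("img"):
    src = img.get("src", "")
    if "green.gif" in src:
        status = "open"
    elif "red.gif" in src:
        status = "closed"

print(f"Facility status: {status}")
```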

This worked well for the first facility, where the website was quite responsive. The second one caused more trouble with regard to response times: now I had a six-second wait before my page was displayed. That wasn't really acceptable, especially since I still had two more places to scrape.

gersfeld-ski

So I took a Saturday afternoon to make it work asynchronously. It turned out to be quite easy: I kept my PHP scripts, but converted them into functions that could be called via AJAX, returning JSON data. From there it took only a couple more minutes and I was done. Displaying the page itself is really fast again, and since the scraped information doesn't appear in the visible part of the browser right away, it's fine if it takes a little longer. But even scrolling down immediately is fun: I enjoy watching the data show up!
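
I did this with PHP functions behind AJAX calls; the same pattern, sketched here in Python with Flask purely for illustration, is a tiny JSON endpoint per facility that the page fetches after it has rendered:

```python
from flask import Flask, jsonify  # pip install flask

app = Flask(__name__)

def scrape_facility(name):
    """Placeholder for the scraping logic sketched above."""
    return "open"

@app.route("/api/facility/<name>")
def facility_status(name):
    # The page itself loads instantly; the browser fetches this JSON afterwards.
    return jsonify({"facility": name, "status": scrape_facility(name)})

if __name__ == "__main__":
    app.run()
```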