Thoughts about the Metaverse

The Metaverse has been trending ever since Mark Zuckerberg announced (on October 28th, 2021) both the rebranding of Facebook to Meta and the next big thing, the “Metaverse”.

As much as I enjoy seeing technology mature, become democratized and grow accessible, I also want to stay realistic. Here are some reflections on the current hype, or the next evolutionary step in human interconnectedness.

Photo by Lucrezia Carnelos on Unsplash
  • The Metaverse emphasizes VR and AR as the media to immerse yourself in. VR has seen several waves of adoption since the 1970s, growing from exclusive use in research labs to a mass consumer product. But to this day, general adoption has not grown significantly outside the gaming and simulation niche.
  • While several expensive high-end headsets have been released or announced for enterprise customers (Varjo, Pimax, XTAL, ..), there is not much in the consumer space; the Quest 2 was released in 2020 (overview). Though suddenly everyone is working on something (Apple, ..). If the Metaverse is to be the next internet, accessible to everyone, we need devices as cheap as mobile phones. And no, Google Cardboard is not an option.
    AR still has a long way to go to achieve mixed reality with seamlessly embedded information. In 2020, AR disappeared from the Gartner hype cycle, which had even predicted enterprise adoption for 2021 (which didn't happen?).
  • The human bioware is not being updated. Newer VR devices are getting better: more lightweight, higher resolution, less latency, etc., but VR fatigue and VR sickness are still an issue. You can get used to them, but they will still affect adoption. If you choose the wrong environment or platform to get started in VR and it spoils your first experience, you might leave for good. I know few people who are “in VR” for more than an hour on a regular basis.
  • If we believe this is the next step in the evolution, why should we rely solely on the company META, whose potential influence on behaviour and opinion will only grow further? Right now, the industry should discuss standards for seamless interoperability, security and data exchange, ensuring the Metaverse will not become a separate, proprietary internet, but an accessible communication and sharing platform, like the internet itself in its beginnings. If we had taken a proprietary approach in the 1990s, HTML would not be readable today, but rather a binary blob opened in the browser, and open source might not be as widespread as it is today. The Metaverse must be open, no matter what hardware or platform is used to access it.
  • META has not yet released Horizon Home; the video material we see is conceptual work and vision (‘Not actual images. Images are strictly for illustrative purpose only.’). Only Horizon Workrooms is available as a beta (at the time of writing this post), and it is only compatible with the Quest 2 (it does not even work with the Rift S). You can use flat-screen access though, which makes little sense to me. The Quest 2 will not be able to render the illustrative concepts, except perhaps by streaming high-end rendered content.
  • At the same time, NVIDIA presents its own take on a Metaverse toolset, Omniverse, but with existing products, plugins and a tangible roadmap.

Conclusion:

  • Let's stay excited, but realistic. Embrace the innovative ideas to come.
  • Ensure it will be the Open Metaverse.
  • Do good and avoid evil. Do not implement the dystopian future depicted in the referenced literature (Snow Crash and others).
  • I am eager to try, experiment and pilot. Especially in the enterprise context, there are use cases for digital twins, simulation and collaboration that make sense and will bring benefits.

Recommended reading:

Google Trends

Thin Client Revival for Generated Art

Part 1 – Hardware

I have been experimenting with generated art on and off for a couple of years now. It allows me to cross the barrier between coding business systems and the world of art, literally creating software that serves no sincere business value at all but creates artistic enjoyment. Using the Processing environment (/library/programming language), it is amazing what fantastic visuals you can produce with little code. Note that Processing is now in its 20th year, launched long before the current hype of AI-generated art using GANs (Generative Adversarial Networks) and people making money with NFTs (Non-Fungible Tokens). To be precise, Processing is more a tool for procedural art: good old algorithms creating visuals, spiced up with randomness or external inputs (e.g. a webcam). Today I won't discuss NFTs or whether it makes sense to buy a JPG file for millions of dollars, nor will I talk about GAN art based on deep learning, like style transfer and similar (another post will cover that).

How do we make generated art accessible to an audience outside the browser? With traditional means we would print the art piece, frame it and hang it on the wall. That limits us to static pieces, but we aim for the creation process and animated pieces as well. I started to work on a setup that runs as an art installation using screens and projectors, so people in a public space can observe and witness the process of a piece being created, or interact with it. I like the uniqueness of each visual when some kind of randomness is used as a parameter. Whatever you see will disappear forever once the screen moves on (provided no screenshot or print is created); you will never see the exact same thing again, though very similar creations come out of the same algorithm.
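To give an idea of how little code is needed: below is a minimal sketch in Processing's Python mode (the Java-mode version looks almost identical). It is not one of my installation pieces, just an illustration of the principle. Each frame layers one translucent, randomly placed and colored ellipse, so the image slowly accumulates and no two runs are ever identical.

    # Minimal generative sketch (Processing, Python mode)
    # Every run produces a unique piece, driven purely by random()

    def setup():
        size(600, 600)     # canvas in pixels
        background(255)    # start from a white canvas
        noStroke()

    def draw():
        # one translucent ellipse per frame; the low alpha value lets
        # earlier shapes shine through, so the piece grows over time
        fill(random(255), random(255), random(255), 40)
        d = random(5, 80)
        ellipse(random(width), random(height), d, d)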

Let's look at the hardware. How do we do this with little money? We need a CPU, an OS, a screen and a stand.

Thin Client

Let's revive thin client hardware that you can find for a few dollars on eBay, usually devices that spent their previous life in an ATM, a POS terminal or behind a check-in counter at an airport. Once retired after a few years, this kind of equipment gets recycled or finds its way onto the electronics second-hand market (and hopefully not into landfills or scrap yards in Africa). Using Linux as the OS, we can use most thin clients built after 2010 with a 64-bit architecture (32-bit is no longer supported by Debian-based systems), 1 or 2 GB of RAM and at least 8 GB of disk space. Since we are running graphics here, we need at least decent performance. I found the Fujitsu Futro S920, launched around 2013, with the AMD G-Series GX-415GA 1.5 GHz quad-core CPU, 4 GB DDR3 RAM and an AMD Radeon™ HD 8330E graphics adapter, which even supports OpenGL 4.1. All for Euro 29,- including the power adapter. Energy consumption is around 10 watts. Replace the 2 GB mSATA drive with a 16 or 32 GB one for another Euro 20,-.

One could argue: why not use a Raspberry Pi? With a proper casing and power adapter, I would end up at almost Euro 100,-.

Fujitsu FUTRO S920

Linux OS

Debian-based distributions are my choice. The Lubuntu distro gives us a small memory footprint and modest disk space requirements.

Screen and Stand

For the screen I sourced 40″ panels, grade B returns, for roughly Euro 100,- each; another way to keep this project sustainable by giving electronic equipment a second life. Now comes the handicraft challenge: building the TV stand. I prefer a portrait setup, and a professional stand easily costs Euro 200,-. Some square iron tubes, basic welding knowledge and some paint do the job. Material cost per stand is about Euro 40,-.

This could even double as a super-low-budget FIDS (Flight Information Display System) setup.

I managed to build the whole setup for less than Euro 200,-. Now it is time to get it ready for public display.

Final setup (on display: the piece ‘sandstorm’, a transformed version by the author; original by Sayama, CC BY-NC-SA 3.0)

A small desktop version made from scrap metal for a 22″ screen

In the upcoming part 2 I will talk about the software setup of the installation, as well as share some insights about Processing.

Stay tuned..

Bookshelf: AI 2041

Another recommended book for the holiday break. I came across this title listening to the NVIDIA podcast (which I also highly recommend). How will artificial intelligence change the world over the next two decades? In 10 stories, Kai-Fu Lee explores the future with a blend of science and fiction, making it more accessible to non-tech readers. It is co-authored by Chen Qiufan, who created the fictional parts. The book was only released last September (and is not yet available in German). Every chapter brings up complex AI topics and hotly debated issues, ranging from Deep Learning, VR and self-driving cars to quantum computing. The non-fiction review of AI concepts analyses and describes how the technology works. It reminds me of reading Isaac Asimov's books 30 years ago.

If you have read earlier books by Kai-Fu Lee, like ‘AI Superpowers: China, Silicon Valley, and the New World Order’ or ‘My Journey into AI..’, this is my recommendation for you.

Get your copy from your favourite book dealer or online. Check out the book website here.

#RetroTech; 80s Home Computer again

I have fond memories of my first steps into computing in the 1980s, when home computing took living rooms and study rooms by storm. For the first time, computing became widely accessible and affordable for everyone. I have only one original device at hand, so we will explore alternative retro options to go down memory lane and also visit some of the other home computing platforms. The retro craze runs through various technology trends: people are starting to value music played on hi-fi turntables and pictures taken with analogue photography equipment again, while others collect old computing equipment and video game consoles. The market reacts to this demand, and you can buy the old technology anew (usually emulators on modern chipsets packed into the old casings): Sony relaunched the PS1, Nintendo the NES and Atari the 2600 console. Prices for authentic old equipment are rising too (recommended NY Times article). In this post we will have a look at the Commodore C64.

Relaunched Commodore C64 in original case

First things first: you do not need to buy any equipment for a brief visit to the home computing past; everything can be done in the browser or with emulation tools on any regular notebook or Raspberry Pi. The Commodore C64 was my first own computer in 1984; I sold it in 1991 to finance my first IBM-compatible PC. But with all the nostalgic memories attached to it, I bought a retro set from Retro Games Ltd. for roughly Euro 100,- (see above image), just for the sake of its physical look and feel (note: no Commodore logo or trademark is used, as the brand was sold and passed on multiple times over the years). You could achieve the same by installing RetroPie, which can emulate almost any home computer and game console of the 80s and 90s.

The Sinclair ZX81

Before looking at the C64, a quick look at the Sinclair ZX81, which I temporarily used (borrowed from a schoolmate) for about a year to do my first computing explorations. This device, released in 1981 by Sinclair Research, was a very basic machine with 1 KB (!) of memory and a Z80 CPU at 3.25 MHz, running Sinclair BASIC and supporting only a monochrome 32 x 24 character screen (on a regular TV set). Everything was included in the box, and the user input was nothing but a pressure-sensitive membrane keyboard. An absolute nightmare for any serious typing, not to mention development, but it was the only thing at hand.

Image by Evan-Amos – CC BY-SA 3.0

It supported an external 64 KB add-on memory pack and a small cashier-style printer, and the only way to load and store programs was on regular audio tapes at 250 bps. If you are keen to give it a spin, drop by this website.

3D Monster Maze by Malcolm Evans in 1981

There was no way to compile applications, so all the commercial tools and games automatically came as open source.

ZX81 Basic Source

The Commodore C64

The famous blue launch screen and the command to start the first app on the disk

The Commodore 64 (aka C64, CBM 64) was definitely THE home computing device of the 1980s, with by far the biggest number of units sold compared to similar devices on the market.

Several extensions and additional hardware made the device quite universal, even allowing non-gaming activities like word processing.

A few software highlights

Microsoft Multiplan

Believe it or not, the great-grandfather of Excel was released in 1982 by Microsoft itself. Very painstaking to use, with absolutely the worst possible UX.

Multiplan on the C64
Wikipedia: Multiplan
Data Becker

The once-famous German publisher Data Becker had a series of office applications like Textomat, Datamat and other xyz-mats.

Source: c-64.online.com

Equally famous were their books covering any C64-related content, from programming to applications of all kinds.

Cover of the 3rd revised edition, 1985
Source: c64-wiki.de
GEOS on the Commodore C64

Launched in 1986 (one year after Microsoft introduced Windows 1.0), GEOS (Graphic Environment Operating System) was released by Berkeley Softworks. Don't forget: this is a graphical OS on a 1 MHz 6502 processor with 64 KB of RAM! I specifically bought a mouse to use it. Fun facts: Nokia used GEOS for its Communicator series before switching to EPOC, and the source code was reverse-engineered and made publicly available on GitHub.

GEOS for the Commodore 64
Wikipedia: GEOS
Sublogic Flight Simulator II

Does anyone remember Flight Simulator 1 by Sublogic, released in 1979? State-of-the-art at that time, considering the hardware inside an Apple II, but a terrible flying experience in a wireframe landscape.

Wikipedia: FS1 Flight Simulator

The sequel, Flight Simulator II, came with major improvements, colors and real-world sceneries. What a quantum leap; it kept me flying for hours. Don't forget to look through the glasses of someone living in the 80s: if you compare this to the latest MS Flight Simulator, it looks like a joke.

Wikipedia: Flight Simulator II (Sublogic)

Other Home Computing Devices from the 80s

Many other home computing devices tried to conquer homes in the 80’s, most of them not even remotely as successful as Commodore.

Amstrad CPC 464, with CTM644 colour monitor
Wikipedia: Amstrad CPC
Sinclair ZX Spectrum 48K
Wikipedia: Sinclair Spectrum
Atari 1040STF
Wikipedia: Atari ST
Apple IIe
Wikipedia: Apple IIe

Conclusion

There is quite some excitement about old technology, mostly for sentimental reasons. It allows us a little time-travel trip into the past. Sad to say, it won't keep you entertained for very long; the memories feel better than experiencing it all again.

#RetroTech; The ZIP Drive

Another piece of tech memorabilia from the 1990s, hidden away in a box for 25 years and recovered during the attic exploration: the infamous Iomega Zip drive 100.

Iomega 100 ZIP Drive

This was certainly a smart innovation in the early 90s, when the predominant (transportable) medium was the 3.5″ disk with 1.4 MB. Iomega came up with this removable 100 MB storage device using a form factor similar to a disk, but offering 70 times more space. Take note: at that time the average hard disk was around 500 MB, so 100 MB was a decent backup option. The drive was not cheap at a price of around US$ 200,-, with single disks at roughly US$ 20,-. Various types were offered, supporting IDE, SCSI, USB and FireWire connections. Still, the device was not as successful as expected; it had to compete with the (writable) CD-ROM and CD-RW, and it faded away in the early 2000s. Iomega no longer exists as an independent company; it was acquired by EMC in 2008.

The above device was recognized by Windows 10, and the 20-year-old backup files could still be read.

Several similar devices were introduced during the same decade, and all of them eventually disappeared: the Jaz Drive, the EZ 135 Drive, the SuperDisk and a few more. All shared the same fate, leaving you in trouble if you trusted them for long-term archival purposes.

Usual office desk sight with storage boxes for disks.

This is a common theme and the “retro” problem we look at here, started in the last episode with the 3.5″ disk; a few more similar cases will follow in upcoming posts. We are now roughly 35 years into mainstream office and home computing, and we already face challenges persisting data for more than a few years.
Book printing was invented by Gutenberg in the 15th century; there are still books around from medieval times, and we can still access the data, i.e. read the text. The comparison can be challenged, of course: it is not feasible to store today's data volumes on paper.
Fun fact: there are some tools and libraries that support creating paper-based backups. Though limited in volume, such a backup will survive doomsday and any EMP, as long as the paper does not catch fire and is laminated to protect it against humidity. Give paperback a try; it even supports key encryption.
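To illustrate the idea (this is not the paperback tool itself, just a sketch of the concept using the Python qrcode library; the file names are made up), a few lines are enough to turn a small file, such as a key, into a printable code:

    import qrcode  # pip install "qrcode[pil]"

    # Hypothetical example: turn a small key file into a printable QR code.
    # A single QR code holds roughly 3 kB of binary data at most, so this
    # works for keys and tiny files only, not for whole archives.
    with open("backup.key", "rb") as f:
        data = f.read()

    img = qrcode.make(data)    # render the QR code as an image
    img.save("backup-qr.png")  # print it, laminate it, store it dry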

Main problems with old storage media and types:

  • File Format
    The format in which a certain type of data is stored on any medium (whether magnetic tape, Blu-ray or cloud storage) might no longer be supported after a few years, e.g. because the format is proprietary or outdated, like the MS Access 2.0 format from the last post.
  • Storage Media Type
    Proprietary devices from decades ago, needed to read the respective media, are no longer built, are not supported by current operating systems, or simply do not function any more.
  • Media Preservation
    Depending on the media type (magnetic, optical or flash memory/semiconductor), the data can survive a longer or shorter time before it starts to degrade and becomes corrupted or unreadable.

Stay tuned for more retro tech explorations..

#RetroTech; Rewind 35 years with the 3.5″ disk

A recent visit to our attic during the Christmas break revealed a number of technology artefacts from the past. Holding these items in your hands, you realize how long you have already been working in IT. Let me share some of the findings with you, like these installer disks (3.5″) sitting in a box for 20+ years. Surprisingly, the majority of these disks, kept in a dry box, can still be read without problems.

Did you notice when 3.5″ disks faded away? At some point the drives were no longer built into notebooks (the same has already happened to CD/DVD-ROM drives today), and eventually they disappeared from desktop PCs too, maybe with the end of the Windows 95 start-up disk. In the 1980s, the 3.5″ disk was launched as the replacement for the infamous 5.25″ floppy disk. While the initial version (early 80s) only offered 360 kB, we could store 1.4 MB on the HD version towards 1990. For perspective: a 3-minute MP3 file is roughly 4 MB in size. It was the main medium for storing and transporting any kind of data. Sony only stopped producing the disks in 2010; now, in 2021, they are extinct.

Some of the above highlights:

  • MS DOS 5 and 6: Release 5 came out in 1991 (3.5″ disk support itself dates back to earlier DOS versions), the same year I bought my very first (own) IBM-compatible PC. Release 6 came in 1993, and 6.22 was eventually the last official release in 1994. (Wikipedia link)
  • MS Windows 95
    Released in 1995, it merged DOS and Windows 3.1 into one OS. It was the first 9x release with the distinct Windows look that persists to this day. Slowly stepping into the 32-bit era; unfortunately it was not really stable, crashed frequently and slowed down over time (my most prominent memories, at least). I remember the plug'n'play feature, which was not as plug'n'play as proclaimed, and spending endless hours finding and fiddling with obscure drivers for hardware. (Wikipedia link)
    That's 25 years ago. Do you remember the commercials with the Rolling Stones song “Start Me Up” and the “Where do you want to go today?” slogan? Fun fact: Bill Gates reportedly paid something like 14 million dollars to the Rolling Stones.
  • MS Visual C++
    Did you notice there is no release number? Right, this is the initial (“Visual”) release 1.0 from 1993, running under 16-bit Windows 3.x. These were my first steps with this programming language; I remember how troublesome it was to create even basic-looking application GUIs. (Wikipedia link)
  • SUSE Linux 7.2
    Five years after the initial release 4.2, this version came out in 2001. It was the first Linux I installed on my own PC; until then, I had used Linux solely at university and at work.
  • 3D Pool by Aardvark
    This 1989 game, a 3D pool simulation, came with my first PC set. Quite amazing 3D rendering on a 256 kB PC with a simple S3 VGA adapter supporting 16/256 colors. Experience it here.

Using this USB disk drive, I was able to retrieve my digital source code memories. You can get these drives for about Euro 30,-. If you are looking for a 5.25″ solution, you have to resort to used equipment on the usual selling platforms, plus you need a desktop that still supports the old IDE/floppy connectors.

USB 3.5″ disk drive
Nerve-racking transfer speed

It took only a few disks to stumble upon a time traveller: the AntiCMOS.A virus from 1994. It survived on the disk for 25 years, only to be kicked out by Windows 10.

Here is some source code retrieved from old disks, like these memories of Z80 assembler code. Can you be any closer to the CPU than this?

Z80 Sourcecode

An extract of a Turbo Pascal application that manipulated the graphics card directly using assembler.
Supposedly there was a way to brick or burn the 1992-era graphics hardware with a combination of specific direct calls; I remember vivid discussions with the head of the IT institute at my university, who feared I would damage something. Today I think that was a tech myth.
I first came across Pascal in the mid-1980s in high school, in an IT class equipped with Apple IIe machines and Apple Pascal. By the way, Pascal turned 50 in 2020!

Pascal Code

I remember my very first PC system: an 80386SX running at 20 MHz with 256 kB RAM, equipped with 5.25″ and 3.5″ disk drives, plus a whopping 20 MB hard disk, which I thought would provide enough space for many years to come. I spent DM 2,500,- (today's equivalent of roughly Euro 2,300,-) for this set, including a 14″ CRT color screen and a Star LC 24-10 dot-matrix printer.

Do you fancy running the old systems? Let's go; we have a few options at hand.

  1. Original Hardware
    Provided you are willing to spend money on old hardware and can find an old IBM-compatible PC (like an 80386) on eBay, plus all the installation disks, this is truly the retro nerd way. You are going to experience the 1990s first hand, with all the slowness, disk swapping, failing parts, etc. I skip that one.
  2. VirtualBox
    If you still own the original disks (like I do in this case), you can spin up a DOS guest session in Oracle's VirtualBox and install everything from scratch. Much faster than option 1, but still a little more nostalgic than options 3 and 4.
Windows 3.11
Windows 95

3. DOS Emulator
Save yourself the time of creating a virtual PC and install a native emulator in your Windows environment. Try DOSBox.

4. Online Emulator
As usual, there is an emulator for everything now, and you can spin up an old piece of hardware in your favourite browser without touching a screwdriver or a disk. Drop by the PCjs website and explore all kinds of OSes and software from the past with the click of a button.

Conclusion

A fun nostalgia experience, exploring the roots of the software and hardware we use today. I learned a lot during those barebone, hands-on times back then; that knowledge is valuable when looking at today's IT environment, where you are layers and layers away from the hardware and the basic understanding of how things work under the hood.

There are times when you need to spin up these emulators or old OSes, for example when you come across files that are no longer supported by modern OS and software releases. I had to install MS Access 97 in order to read old Access 2.0 databases.

MS Access 97 Installer

Stay tuned for more retro tech exploration..

Podcasts on AI and Data Science

Interest in Artificial Intelligence has exploded over the last few years, with hardware and software increasing performance massively; at the same time, we have data in abundance to work with. Deep Learning is certainly the most looked-for topic in Computer Science. Anyone can do ML/DL at home now; the whole field has been democratized and made accessible. With a regular laptop you can get started easily, using a selection of online and local tools/resources and a huge choice of data at hand (e.g. Kaggle and other data sources). You can scale to process larger datasets either by installing more RAM and a GPU in your machine, or by using paid resources from AWS, Google, Microsoft and others.

The learning curve is steep; many online courses and books are available, maybe too many to choose from. Beyond that, how do you stay up to date or gain more insights? The good old podcast (20 years since the term was coined) is a welcome alternative to reading. You can listen during your commute (who is commuting nowadays?) or during other physical activities. Though I find it a bit hard to have complex technical material (algorithms) explained without any visual context, there are still plenty of topics that work well, ranging from legal and ethical aspects to interviews with practitioners in various fields, and many more. It is impossible to follow all the podcasts out there, but you can subscribe to a few and hand-select the episodes that are of interest or relevance to you.
Here is a list of podcasts I follow and would like to highlight, updated over time. The focus is on podcasts produced in English and actively maintained. (Last update: 2020-12-05)

Lex Fridman

Lex is a researcher at MIT, working on autonomous driving, human-robot interaction and all kinds of machine learning topics. He comes across as quite an introverted character, always wearing a black suit, speaking very calmly without fuss or excitement, yet conveying lots of insights to his audience. His interviews cover topics from machine learning, mathematics, philosophy, ethics and astrophysics to plasma physics.

Since 2018 he has produced more than 140 episodes of his podcast, and it is amazing to listen to the high-profile people he invites from the academic world in interviews between 60 and 90 minutes in length. Among his guests have been Alex Filippenko, Michio Kaku, Andrew Ng, Ian Hutchinson, Kai-Fu Lee, James Gosling, Richard Karp, Elon Musk and many more.

I also recommend watching his presentation “Deep Learning State of the Art (2020)” from the MIT Deep Learning Series and the accompanying website deeplearning.mit.edu.

Episodes: 140+ since August 2018

Podcast Website: lexfridman.com/podcast (all the episodes also available on YouTube)

In Machines We Trust

Running since summer 2020, host Jennifer Strong and the MIT Technology Review team discuss the more ethical side of machine learning. I highly recommend the episodes about the application of face recognition and its implications for society.

Episodes: 15+ since July 2020

Podcast Website: forms.technologyreview.com/in-machines-we-trust

Eye On A.I.

Former New York Times correspondent Craig S. Smith walks the audience through a very diverse range of AI-related topics by interviewing various experts.

Episodes: 61+ since October 2018

Podcast Website: www.eye-on.ai/podcast-archive

Practical AI: Machine Learning & Data Science

Chris Benson and Daniel Whitenack discuss real use cases, datasets and setups of AI exploration. Unlike many other interview-style podcasts, this one is rather hands-on.

Episodes: 115+ since July 2018

Podcast Website: changelog.com/practicalai

The TWIML AI Podcast

In this very actively maintained podcast, with new episodes every few days, Sam Charrington talks to various AI researchers, data scientists, engineers and tech-savvy business and IT leaders.

Episodes: 449+ since May 2016

Podcast Website: twimlai.com

The AI Podcast

This podcast, operated by NVIDIA, the biggest player in the GPU hardware game, runs talks and interviews with leading experts in the field.

Episodes: 129+ since November 2016

Podcast Website: blogs.nvidia.com/ai-podcast

AI with AI

This podcast, moderated by Andy Ilachinski and David Broyles from the Center for Autonomy and Artificial Intelligence, a group inside CNA (the Center for Naval Analyses, which does research for the U.S. Navy and Marine Corps), discusses the latest developments in the field. The topics are sometimes related to military use of AI, but recent episodes also look into Covid-related topics.

Episodes: 15+ since July 2020

Podcast Website: www.cna.org/news/AI-Podcast

Photo by Austin Distel on Unsplash

Raspberry Pi goes Desktop

The Raspberry Pi Foundation just launched the Raspberry Pi 400: the regular ARM-based SBC with 4 GB of memory, a bunch of external ports and wireless connectivity, but all packaged into a neat keyboard casing. This is the first time they have moved away from the pure tinkering setup towards a just-hook-it-up-to-a-screen-and-get-started set. You can buy the “keyboard” alone at Euro 70,- or get the Euro 100,- kit with power supply, HDMI cable, mouse and a pre-loaded micro-SD card that gets you up and running in less than 15 minutes (including automatically downloading and installing the latest Raspberry Pi OS). The form factor reminds me remotely of the Sinclair ZX Spectrum and ZX81 of the 80s.

Raspberry PI 400 in the box
Raspberry PI 400 (all built-in)
Raspberry PI 400 (back)

The Raspberry Pi 400 has barely changed the specs from the 2019 4B release. The SBC comes with the ARM quad-core Cortex-A72 CPU (clocked at 1.8 GHz here, slightly above the 4B's 1.5 GHz), 4 GB of memory (the 4B is available with 1, 2, 4 or now also 8 GB), wireless LAN, Bluetooth 5.0, BLE, USB ports (2.0 and 3.0) and 2 micro-HDMI ports supporting 4Kp60. Plus the favourite 40-pin GPIO connector for prototyping and accessing the huge tinkering space, to get started with anything from home automation to controlling robots or weather stations, as sketched below.
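As a small taste of that tinkering space, here is a minimal Python sketch using the well-known RPi.GPIO library to blink an LED; the pin number and timing are just example values, adjust them to your wiring:

    import time
    import RPi.GPIO as GPIO

    LED_PIN = 18                   # example pin, adjust to your wiring

    GPIO.setmode(GPIO.BCM)         # use Broadcom pin numbering
    GPIO.setup(LED_PIN, GPIO.OUT)

    try:
        for _ in range(10):        # blink ten times
            GPIO.output(LED_PIN, GPIO.HIGH)
            time.sleep(0.5)
            GPIO.output(LED_PIN, GPIO.LOW)
            time.sleep(0.5)
    finally:
        GPIO.cleanup()             # release the pins on exit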

Is it a desktop PC or Windows notebook killer? It depends; maybe not. With Raspberry Pi OS, a Debian-based operating system, you have access to a lot of applications for daily use. If you know Ubuntu, you know Raspberry Pi OS!

Raspberry PI 400 running a 4K Screen

It won't work for corporate environments living in the Microsoft product world, though you could use the web version of Office 365. It is also not the right choice if you want to do non-linear video editing or run fancy, resource-hungry DX12 (Windows again) games. But it works perfectly for surfing the web, YouTube, etc. Most average usage is web-based anyway, and with this device we can keep energy consumption as low as 5 watts (plus the screen).

A remark on 4K: out of the box (without overclocking and a proper heat sink/ventilation) you will only get a stuttering experience; regular HD 720p works fine, and even Full HD is still acceptable.

Raspberry PI setup as we know it

Conclusion: a great idea gets re-packaged to make it accessible to a wider audience. This would be a great platform for our education system. Instead of tying kids to Microsoft products or spending hundreds of euros on Apple iPads as part of the so-called digitalisation roadmap, we would be better off embracing such an open platform. Have you ever attached a temperature sensor to an iPad as part of the physics curriculum?

The only thing missing is a Raspberry Pi that comes as a tablet or notebook.

HTTP Cookie Warfare

You won't visit any web page today without cookies being involved, literally leaving a trail of crumbs for all kinds of third parties to track your whereabouts and activities on the web. Cookies get a lot of attention; you are constantly creating and updating them by accepting or consenting to the privacy and cookie usage terms on many websites, yet most internet users don't really know what cookies are or how they work. They were created back in 1994 by Lou Montulli, who was working at Netscape, for a legitimate reason: storing a file in your local browser storage as a reference, to inform a server whether the user had visited the site previously. It was patented in 1995:
US5774670A “Persistent client state in a hypertext transfer protocol based client-server system”

Original Drawing U.S. Patent 5,774,670 Page 6

About Cookies

Cookies are served either by the website you visit (a first-party cookie) or as a third-party cookie by a service embedded into the website you visit, e.g. ad companies. The browser creates a local cookie entry with some unique ID, which the server side checks during your next visit, or your visit to another site using the same cookie. The primary purposes are session management, personalization and tracking. Technically, a cookie is text with key-value combinations; modern browsers store cookies in a database, e.g. Firefox uses a SQLite DB. Contrary to common belief, there is no encrypted information inside, nor any personal information such as your name. The power lies in creating a digital fingerprint by combining the cookie with other information, e.g. the IP address, the user-agent string sent by the browser and information about other sites visited, to profile the user.
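To see the mechanism in isolation, here is a minimal sketch using only the Python standard library (the cookie name uid and the port are made up for the example): a tiny server hands out a random ID on your first visit, and the browser returns it with every subsequent request, which is all a server needs to recognize you.

    import uuid
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class CookieDemo(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            if "Cookie" not in self.headers:
                # first visit: hand out a random ID; the browser will
                # return it with every following request
                self.send_header("Set-Cookie",
                                 "uid=%s; Max-Age=86400" % uuid.uuid4())
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            seen = self.headers.get("Cookie", "none yet, first visit")
            self.wfile.write(("Your cookie: %s\n" % seen).encode())

    # visit http://localhost:8000 twice to watch the ID come back
    HTTPServer(("localhost", 8000), CookieDemo).serve_forever()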

Table structure for cookies in Firefox
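Since Firefox stores cookies in a plain SQLite file (cookies.sqlite inside the profile folder; the exact path varies per installation, and you should close Firefox or work on a copy of the file first), you can inspect them yourself with a few lines of Python:

    import sqlite3

    # assumption: a copy of cookies.sqlite from the Firefox profile
    # folder (e.g. somewhere under ~/.mozilla/firefox/) lies here
    db = sqlite3.connect("cookies.sqlite")

    query = "SELECT host, name, expiry FROM moz_cookies ORDER BY host LIMIT 20"
    for host, name, expiry in db.execute(query):
        print(host, name, expiry)

    db.close()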

Would you like to observe the creation of cookies and their content when opening a website? Start the developer tools of Firefox or Chrome first. I randomly chose cnn.com; you can do this with any commercial website.

Firefox Developer – Cookies View

Take note of cnn.com placing a cookie before you consent.

Earlier, if you wanted to protect yourself from third-party tracking, you had to install additional add-ons for your browser; now Firefox has become smarter and ships with built-in protection. If you would like to see the blocked cookies, disable the feature. I recommend doing this in a private window session (which deletes all cookies after closing).

Firefox protection
Blocked Tracker Cookies

Cookies and Privacy Consent Pop-Ups

Hidden Options

Common to almost all websites: they try to steer you away from not consenting. The ACCEPT button is very prominent, but there is no I DO NOT ACCEPT; the opt-out is always hidden behind a link with a different label. They rely on our laziness to avoid going to an extra page to disable the cookies.

Ebay Consent Pop-Up

eBay makes you accept by pressing the Accept button or by clicking on any item on the website. To disable cookies, you have to go to More Information and scroll to the end to confirm by pressing Continue. At least all cookies are disabled on this screen by default.

Some other samples:

ZDnet Pop-Up
cnn.com pop-up

A sample of a proper implementation: one click to reject all or limit cookies.

Overwhelming Number of Players

The marketing landscape behind the scenes can be breathtaking; let's look at the website wired.co.uk.

Have you ever clicked on the List of Partners (vendors)?
There are no fewer than 500 companies listed, and each one comes with its own privacy policy.

The different Strategies

Today, cookies (or rather their pop-ups) have become an annoyance; they disturb the user experience on the web because the cookie consent pop-up is the first representative of a company or service you see when visiting a site. Sometimes it is followed by a pop-up asking whether it may alert you about news, or a bot assistant offering (not so) smart answers. Let's have a look at the different ways of obstructing content with consent pop-ups.

The Obfuscator Entrance Website

The first thing you see is nothing but the pop-up over a blurred background. You literally can't read a line without consenting to everything or going through the options.

Engadget.com by Verizon (the company that bought Yahoo)
The ‘Not-so-obfuscated-but-no-control’ Websites

The same as the previous type: a prominent pop-up, but you can see the landing page content, though you cannot click anything.

The ‘There-are-no-Options’ Websites

You land on the page and can access all links and pages, but you cannot opt out of anything. The pop-up is displayed permanently until you finally accept.

TechNewsWorld pop-up
theverge.com pop-up

Tools

The different browsers offer various add-ons to manage cookies or inspect their content. One is Cookiebro; there are many similar ones.

Conclusion

Under the current laws and regulations (GDPR, 2009/136/EC), all of these samples are in line with legislation (yet to be proven in court).
Basically, you cannot escape cookies completely. If you disable them entirely, you won't be able to read your (web) emails, manage your shopping cart or use other essential functions. We can rely to some extent on Firefox to block the worst tracking cookies, and we can wipe all cookies after closing the browser, which then requires you to enter passwords every time you visit the same site (or use the password manager in Firefox).

Germany’s Covid-19 Tracing App

Finally, the contact tracing app is here. With big media attention, the app was launched to the public two days ago, after a bumpy start with a complete strategic change from a centralized to a decentralized model. SAP and Deutsche Telekom cooperated and created the app and server environment in less than two months; chapeau for that performance. It comes at 20 million euros, plus 2 to 3 million monthly for Deutsche Telekom, mostly to operate the hotline (not sure if they actually built an office building first, but anyway). Some might claim a start-up could have done it for a fraction of the cost. Yes, maybe, but would it be ready? Would it be able to provide a system that can talk to the public healthcare system to create TANs to push alerts?

You can download it from the Google and Apple app stores. Interestingly, the app was downloaded more than 1 million times within 2 days, after 3 days supposedly 8 million times, and it received more than 30,000 ratings/reviews (Android only) in that short timeframe (mostly positive)!
The team seems to work in an agile fashion; they pushed out an update within 72 hours.

You can review and download the code on GitHub. I must admit the documentation is really good; no one can claim this is not transparent. Despite that, many people still claim it will be used for tracking people and their whereabouts.
First thing to do? Let's look at the source code and the underlying libraries by Apple/Google. I did not manage to build it with Android Studio 3.4 and had to update to 4.0 to get it built and running on the phone, though it would fail after the initial info screens due to the missing Exposure Notification API on the phone. I guess that will be fixed once the Play Store app is updated.

Findings:

  • Permissions
    As planned and communicated, only the bare minimum needed to make this app work. You can see it does not even require the coarse location permission usually required for BLE.
  • Libraries
    A few of the standard libraries: ZXing Embedded, Joda-Time, Room, SQLCipher, detekt and some others.
    Most importantly, Google's Exposure Notifications API.
  • Why would you disallow the user from taking screenshots if everything is transparent and open source?
  • Once you have installed the APK file through Android Studio, you can't install the official app, even after removing the app installed through adb.

Conclusion: Full transparency. It materialized quite fast; too late for the first wave (no app can be ready for something totally new), but certainly in place should a second wave or individual hotspots appear. I still have doubts whether we will reach 60% penetration, but the initial figures look good. You can still argue about the quality of measuring Bluetooth signal strength and how many warnings we will see.