A collection of thoughts, code and snippets.

Network with Tailscale

Posted on Monday, 13 January 2025

I migrated my OpenVPN-based network to Tailscale in 2023, and it has been game-changing. I have used Tailscale ever since; I simply never updated my blog and network diagram.

Network

With Tailscale, all my machines appear seamlessly on a single control plane and I can reach any of them from any device.

Zone Z

Z has a single fiber connection to the internet via Swisscom.

Inventory

  • DeepThread is an AMD Threadripper 1920X running Windows 10.
  • Minis4 is a Beelink MiniS12 N95 running Ubuntu Server 24.10.
  • NAS is an older QNAP TS-269L.

Zone N

N has two connections: a Starlink (v1, round dish) using only its power-brick router, and Sosh as a backup DSL provider (with an ADSL router). Both are connected to a Ubiquiti UDM-PRO-SE in failover mode.

Inventory

  • Minis1 is a Beelink MiniS12 N95 running Windows 11, currently VESA-mounted behind a screen in the office. I originally planned to install Ubuntu on it as well, but a Windows machine is useful to have around.
  • Minis2 and Minis3 are Beelink MiniS12 N95s running Ubuntu Server 24.10, currently rack-mounted with the UDM-PRO.

VPN

On the UDM-PRO, a VPN is configured through Ubiquiti, and I can use the WiFiman iOS application to access the network. WiFiman is really a backup to the backup.

On Minis2 and Minis4, a cloudflared Docker container is running, reaching up to Cloudflare and providing a Zero Trust tunnel that exposes several Dockerized websites hosted on those machines.
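For reference, a token-based cloudflared service of this kind can be sketched as a small compose file. This is a minimal sketch, not my exact setup; the `TUNNEL_TOKEN` value is an assumption and would come from the tunnel created in the Cloudflare Zero Trust dashboard:

```yaml
services:
  cloudflared:
    image: cloudflare/cloudflared:latest
    command: tunnel run
    environment:
      # Token issued by the Cloudflare Zero Trust dashboard for this tunnel
      - TUNNEL_TOKEN=${TUNNEL_TOKEN}
    restart: unless-stopped
```

The public hostnames and which local container each one maps to are then configured on the Cloudflare side, so nothing needs to be exposed on the router.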

I made a Suno song about how awesome it is.


On the Horizon - 2025 - My predictions

Posted on Monday, 13 January 2025

In a way, predicting 2025 is both harder and easier than predicting 2024: much of what I see is the seeds of 2024 coming to bloom. What we will have by the end of the year is really unclear to me, but we will certainly see an impact in research: AI-assisted research will explode across fields this year.

Predictions:

AI Video / AI Audio

  • Cinema-level visuals made in generative environments - not only creating a video, but creating the entire space so that the camera can then be moved in post-production - will become available to high-end customers. This approach complements diffusion models, which only maintain temporal consistency across a few frames; generating the space itself allows much better time coherence and object permanence. Nvidia Cosmos is closest to this, and I think the next version of it will satisfy this point. I expect many video models to start using this method, with temporal control nets, to avoid lapses in object permanence.
  • AI chatbots will be allowed to sing, make music and emote more. While some LLMs are already capable of such things, these abilities are generally removed in post-training, but I think the restrictions will be lifted this year. Suno's lead on AI music gets folded into a leading model, meaning you'll be able to ask a ChatGPT competitor to "make me a song, with lyrics and background track".
  • Visual understanding models will be commonly deployed, meaning you can point your camera at something and get a full description of what you see. It's nearly there already.

AI Agents

By agent, I mean an application in which an AI model takes actions against external systems on behalf of a user, in furtherance of the user's given task and goal.

  • A desktop-based agent will become available for use on the computer, interacting with your browser and mail client automatically (Anthropic's Claude will probably be there soonest). This doesn't mean the LLM has to run on the desktop.
  • AI agents included in software (Teams, GitHub, ...) will become available, at least in preview, before the end of the year.
  • Programming agents will start to be useful (compare Devin in 2024, still completely unusable); in 2025 AI coders will be a focus and will ship mid-year.

AI in Mathematics

  • While the first theorem proof by an LLM has already been published, I expect 2025 to bring a slew of progress on fundamental proofs rewritten by LLMs or LRMs, particularly targeting automated proof systems (Coq, ...), and several proofs generated by LLMs that humans had not independently derived.

AI in Medicine

  • One company will announce an AI diagnostic companion for health that is certified as a support tool for doctors.

AI in War

  • A semi-autonomous multi-agent AI will be used to control a tactical engagement in Ukraine. This has nearly happened with the engagement of the first robot brigade (December 2024: "Ukraine's All-Robot Assault Force Just Won Its First Battle") - but without AI, using tele-operation. I think AI will be used at more levels than just terminal guidance of the FPV drones.

AGI

  • One company claims their model has attained AGI - defined as a model that is as good as a reasoning human at office-work-related tasks. There is a lot of disagreement about whether this constitutes AGI or not.

Robotics:

  • Humanoid robots become available in limited quantities to the general public, with at least one company shipping a home robot with AI onboard to do simple tasks (Unitree from 2024 does not count, since those robots are only remote-controlled). These first robots will sometimes be teleoperated for specific tasks. Pricing will be lower than $50k per robot.
  • One company announces wide scale drone deliveries in US cities: While drone delivery companies already exist (13 Drone Delivery Companies to Know | Built In), one of them is going to break out as an early leader this year.

Space:

  • SpaceX demonstrates first ship to ship refueling.
  • Blue Origin gets to orbit with New Glenn and proves the landing system, but is not able to send a reused first stage to orbit yet.
  • The first part of a new space station gets deployed, probably commercial.

Environment:

  • 2025 beats 2024 as hottest year.

Energy:

  • Record installation of solar, wind and other renewables (beating 2024 worldwide, despite the US's drop).
  • Price spikes in gas and petrol due to US actions and disruption of Russian production. I'm unsure whether that will continue throughout the year, but we will have shocks from policy changes.

Disposable software

Posted on Monday, 9 December 2024

The era of throw away software is upon us

With the advent of LLMs and their capability to create quick programs ("create me a flashcard app", "I need a typing exercise software", "make a dashboard to track my investments"), we might see a lot more software being written, used and then discarded, since it's trivial for an LLM to re-write it the next time it is needed.

Where perfection is not required, just good enough, there will be a whole slew of applications, websites and programs that are used and put into production without ever being reviewed by a human programmer, just tested for their outputs and/or visually checked by a human. We might even see lots of code that is only ever read by a machine for bugs and issues, and then corrected by a machine.


Using an audit trail table on Oracle

Posted on Saturday, 5 October 2024

Implementing Auditable Updates in a Relational Database

In modern applications, maintaining an audit trail of changes to data is crucial for compliance, debugging, and data integrity. This blog post explores a straightforward approach to implementing auditable updates in a relational database system, specifically focusing on a project management scenario with hierarchical data.

Problem Description

We have a relational database containing Projects, each of which includes Instruments, Markets, and Valuations. These entities form a tree structure, adhering to the third normal form (3NF). Previously, any update to a project involved downloading the entire project tree, making changes, and uploading a new project under a new ID to ensure complete auditability.

This approach is inefficient for small updates and doesn't allow for granular tracking of changes. The goal is to enable small, precise updates to projects while maintaining a comprehensive audit trail of all changes.

Solution Overview

We introduce an audit table that records every change made to the database. The audit table will store serialized JSON representations of operations like update, insert, and delete. We'll also provide C# code to apply and revert these changes, effectively creating an undo stack.

Let's use the following DB Schema for illustration:

[Table schema diagram: projects → instruments, markets, valuations, plus the change_audit table]

  • Primary Keys: Each table has a primary key (e.g., ProjectID, InstrumentID).
  • Foreign Keys: Child tables reference their parent via foreign keys (e.g., instruments.ProjectID references projects.ProjectID).
  • Audit Table: The change_audit table records changes with fields like ChangeAuditID, TimeApplied, and ImpactJson.

Implementing Change Auditing

The change_audit table is designed to store all changes in a JSON format for flexibility and ease of storage.

CREATE TABLE change_audit (
  ChangeAuditID   NUMBER PRIMARY KEY,
  TimeApplied     TIMESTAMP,
  UserID          VARCHAR2(100),
  ImpactJson      CLOB
);
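As a side note, on Oracle 12c or later two small additions can make the table self-maintaining: an identity column for the key and an IS JSON check constraint on the payload. A possible variant (an assumption on my part, not required by the rest of the post):

```sql
CREATE TABLE change_audit (
  ChangeAuditID   NUMBER GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
  TimeApplied     TIMESTAMP DEFAULT SYSTIMESTAMP,
  UserID          VARCHAR2(100),
  ImpactJson      CLOB CONSTRAINT impact_is_json CHECK (ImpactJson IS JSON)
);
```

On versions before 12c, a sequence plus a BEFORE INSERT trigger achieves the same numbering.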

JSON Structure for Changes

Each change is recorded as a JSON object:

{
  "Operation": "update",
  "Impact": [
    {
      "Table": "Instruments",
      "PrimaryKey": {"ProjectID": 4, "InstrumentID": 2},
      "Column": "Name",
      "OldValue": "Old Instrument Name",
      "NewValue": "Updated Instrument Name"
    }
  ]
}
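Producing such a record in C# is a matter of a few Json.NET calls. A minimal sketch for the update case (the old value would come from a SELECT against the row before updating it; the field names follow the JSON schema given later in the post):

```csharp
using Newtonsoft.Json.Linq;

public static class ImpactBuilder
{
    // Builds the audit payload for a single-column update
    public static JObject UpdateImpact(string table, JObject primaryKey,
                                       string column, string oldValue, string newValue)
    {
        return new JObject
        {
            ["Operation"] = "update",
            ["Impact"] = new JArray
            {
                new JObject
                {
                    ["Table"] = table,
                    ["PrimaryKey"] = primaryKey,
                    ["Column"] = column,
                    ["OldValue"] = oldValue,
                    ["NewValue"] = newValue
                }
            }
        };
    }
}
```

The resulting JObject serializes via `ToString()` straight into the ImpactJson CLOB.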

C# to apply changes given an operation

To apply changes recorded in the JSON, we'll use C# code that parses the JSON and executes the corresponding SQL commands.

I assume you have the _connectionString available somewhere as a constant in the code.

using Oracle.ManagedDataAccess.Client;
using Newtonsoft.Json.Linq;
using System;
using System.Collections.Generic;

public class ChangeApplier
{

    public void ApplyChanges(string jsonInput)
    {
        // Parse the JSON input
        var operation = JObject.Parse(jsonInput);
        string opType = operation["Operation"].ToString();
        var impactList = (JArray)operation["Impact"];

        using (var conn = new OracleConnection(_connectionString))
        {
            conn.Open();
            using (var transaction = conn.BeginTransaction())
            {
                try
                {
                    foreach (var impact in impactList)
                    {
                        string table = impact["Table"].ToString();
                        var primaryKey = (JObject)impact["PrimaryKey"];
                        string column = impact["Column"]?.ToString();
                        string newValue = impact["NewValue"]?.ToString();

                        switch (opType)
                        {
                            case "update":
                                ApplyUpdate(conn, table, primaryKey, column, newValue);
                                break;
                            case "insert":
                                ApplyInsert(conn, table, impact);
                                break;
                            case "delete":
                                ApplyDelete(conn, table, primaryKey);
                                break;
                        }
                    }

                    transaction.Commit();
                }
                catch (Exception ex)
                {
                    transaction.Rollback();
                    Console.WriteLine($"Error applying changes: {ex.Message}");
                }
            }
        }
    }

    private void ApplyUpdate(OracleConnection conn, string table, JObject primaryKey, string column, string newValue)
    {
        var pkConditions = BuildPrimaryKeyCondition(primaryKey);
        var query = $"UPDATE {table} SET {column} = :newValue WHERE {pkConditions}";

        using (var cmd = new OracleCommand(query, conn))
        {
            cmd.BindByName = true; // ODP.NET binds by position unless told otherwise
            cmd.Parameters.Add(new OracleParameter("newValue", newValue));
            // Bind the primary key values referenced by pkConditions
            foreach (var prop in primaryKey.Properties())
            {
                cmd.Parameters.Add(new OracleParameter(prop.Name, prop.Value.ToString()));
            }
            cmd.ExecuteNonQuery();
        }
    }

    private void ApplyInsert(OracleConnection conn, string table, JToken impact)
    {
        var primaryKey = (JObject)impact["PrimaryKey"];
        var newValues = (JObject)impact["NewValues"];
        var columns = new List<string>();
        var values = new List<string>();

        foreach (var property in primaryKey.Properties())
        {
            columns.Add(property.Name);
            values.Add($":{property.Name}");
        }

        foreach (var property in newValues.Properties())
        {
            columns.Add(property.Name);
            values.Add($":{property.Name}");
        }

        var query = $"INSERT INTO {table} ({string.Join(", ", columns)}) VALUES ({string.Join(", ", values)})";

        using (var cmd = new OracleCommand(query, conn))
        {
            cmd.BindByName = true; // match parameters by name, not position
            foreach (var property in primaryKey.Properties())
            {
                cmd.Parameters.Add(new OracleParameter(property.Name, property.Value.ToString()));
            }

            foreach (var property in newValues.Properties())
            {
                cmd.Parameters.Add(new OracleParameter(property.Name, property.Value.ToString()));
            }

            cmd.ExecuteNonQuery();
        }
    }

    private void ApplyDelete(OracleConnection conn, string table, JObject primaryKey)
    {
        var pkConditions = BuildPrimaryKeyCondition(primaryKey);
        var query = $"DELETE FROM {table} WHERE {pkConditions}";

        using (var cmd = new OracleCommand(query, conn))
        {
            cmd.BindByName = true;
            // Bind the primary key values referenced by pkConditions
            foreach (var prop in primaryKey.Properties())
            {
                cmd.Parameters.Add(new OracleParameter(prop.Name, prop.Value.ToString()));
            }
            cmd.ExecuteNonQuery();
        }
    }

    private string BuildPrimaryKeyCondition(JObject primaryKey)
    {
        var conditions = new List<string>();
        foreach (var prop in primaryKey.Properties())
        {
            conditions.Add($"{prop.Name} = :{prop.Name}");
        }
        return string.Join(" AND ", conditions);
    }
}
  • ApplyChanges: Parses the JSON input and determines the operation type.
  • ApplyUpdate: Executes an UPDATE SQL command using parameters to prevent SQL injection.
  • ApplyInsert: Executes an INSERT SQL command, constructing columns and values from the JSON.
  • ApplyDelete: Executes a DELETE SQL command based on the primary key.
  • BuildPrimaryKeyCondition: Constructs the WHERE clause for SQL commands.

A side note on inserts: if you are using auto-incremented IDs, you don't know the new IDs until the data has actually been inserted, so make sure to capture the new IDs first and only then create the audit log entry. This is left as an exercise to the reader where necessary.
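For illustration, capturing a sequence-generated key on Oracle can be done in the same round trip with a RETURNING ... INTO clause and an output parameter. A minimal sketch, assuming a hypothetical instrument_seq sequence for the example instruments table (this also needs `using System.Data;` and `using Oracle.ManagedDataAccess.Types;` in addition to the usings above):

```csharp
private int InsertInstrumentCapturingId(OracleConnection conn, int projectId, string name)
{
    // RETURNING ... INTO hands back the generated key with the insert itself
    var query = "INSERT INTO instruments (ProjectID, InstrumentID, Name) " +
                "VALUES (:projectId, instrument_seq.NEXTVAL, :name) " +
                "RETURNING InstrumentID INTO :newId";

    using (var cmd = new OracleCommand(query, conn))
    {
        cmd.BindByName = true;
        cmd.Parameters.Add(new OracleParameter("projectId", projectId));
        cmd.Parameters.Add(new OracleParameter("name", name));
        var newId = new OracleParameter("newId", OracleDbType.Decimal)
        {
            Direction = ParameterDirection.Output
        };
        cmd.Parameters.Add(newId);
        cmd.ExecuteNonQuery();

        // Only now is the key known, so only now can the audit row be written
        return ((OracleDecimal)newId.Value).ToInt32();
    }
}
```

With the returned ID in hand, the "insert" impact JSON can be built and appended to change_audit within the same transaction.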

C# to revert changes

To revert changes (undo operations), we process the audit trail in reverse order. Here I show the processing of a list of operations as an example of unrolling. Note that reverting a delete only touches a single table, so if connected rows were also removed via referential integrity (cascading deletes), it was the audit table's job to have recorded those deletions as well.

public class ChangeReverter
{
    public void RevertChanges(List<string> jsonOperations)
    {
        using (var conn = new OracleConnection(_connectionString))
        {
            conn.Open();
            using (var transaction = conn.BeginTransaction())
            {
                try
                {
                    jsonOperations.Reverse(); // note: you could also have provided sorted by last time from the audit table instead of reversing them

                    foreach (var operationJson in jsonOperations)
                    {
                        var operation = JObject.Parse(operationJson);
                        string opType = operation["Operation"].ToString();
                        var impactList = (JArray)operation["Impact"];

                        foreach (var impact in impactList)
                        {
                            string table = impact["Table"].ToString();
                            var primaryKey = (JObject)impact["PrimaryKey"];
                            string column = impact["Column"]?.ToString();
                            string oldValue = impact["OldValue"]?.ToString();

                            switch (opType)
                            {
                                case "update":
                                    RevertUpdate(conn, table, primaryKey, column, oldValue);
                                    break;
                                case "insert":
                                    ApplyDelete(conn, table, primaryKey);
                                    break;
                                case "delete":
                                    RevertDelete(conn, table, impact);
                                    break;
                            }
                        }
                    }

                    transaction.Commit();
                }
                catch (Exception ex)
                {
                    transaction.Rollback();
                    Console.WriteLine($"Error reverting changes: {ex.Message}");
                }
            }
        }
    }

    private void RevertUpdate(OracleConnection conn, string table, JObject primaryKey, string column, string oldValue)
    {
        var pkConditions = BuildPrimaryKeyCondition(primaryKey);
        var query = $"UPDATE {table} SET {column} = :oldValue WHERE {pkConditions}";

        using (var cmd = new OracleCommand(query, conn))
        {
            cmd.BindByName = true;
            cmd.Parameters.Add(new OracleParameter("oldValue", oldValue));
            // Bind the primary key values referenced by pkConditions
            foreach (var prop in primaryKey.Properties())
            {
                cmd.Parameters.Add(new OracleParameter(prop.Name, prop.Value.ToString()));
            }
            cmd.ExecuteNonQuery();
        }
    }

    private void RevertDelete(OracleConnection conn, string table, JToken impact)
    {
        var primaryKey = (JObject)impact["PrimaryKey"];
        var oldValues = (JObject)impact["OldValues"];
        var columns = new List<string>();
        var values = new List<string>();

        foreach (var property in primaryKey.Properties())
        {
            columns.Add(property.Name);
            values.Add($":{property.Name}");
        }

        foreach (var property in oldValues.Properties())
        {
            columns.Add(property.Name);
            values.Add($":{property.Name}");
        }

        var query = $"INSERT INTO {table} ({string.Join(", ", columns)}) VALUES ({string.Join(", ", values)})";

        using (var cmd = new OracleCommand(query, conn))
        {
            cmd.BindByName = true; // match parameters by name, not position
            foreach (var property in primaryKey.Properties())
            {
                cmd.Parameters.Add(new OracleParameter(property.Name, property.Value.ToString()));
            }

            foreach (var property in oldValues.Properties())
            {
                cmd.Parameters.Add(new OracleParameter(property.Name, property.Value.ToString()));
            }

            cmd.ExecuteNonQuery();
        }
    }

    // Reuse BuildPrimaryKeyCondition and ApplyDelete methods from ChangeApplier
}
  • RevertChanges: Processes the list of JSON operations in reverse order to undo changes.
  • RevertUpdate: Sets the column back to its old value.
  • RevertDelete: Re-inserts a deleted row using the old values stored in the audit trail.
  • ApplyDelete: Deletes a row, used here to undo an insert operation.

JSON schema

The reason I prefer to work with the JSON directly in the C# code is that writing out the C# classes for this schema is actually more work than processing the JSON in the code.

{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "title": "ImpactJsonRoot",
  "type": "object",
  "properties": {
    "Operation": {
      "type": "string",
      "enum": ["update", "insert", "delete"],
      "description": "Type of operation"
    },
    "Impact": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "Table": {
            "type": "string",
            "description": "Name of the table affected"
          },
          "PrimaryKey": {
            "type": "object",
            "description": "Primary key fields and their values",
            "additionalProperties": {
              "type": ["number", "null"]
            }
          },
          "Column": {
            "type": "string",
            "description": "Column affected (for updates)"
          },
          "OldValue": {
            "type": ["string", "number", "boolean", "null"],
            "description": "Previous value (for updates and deletes)"
          },
          "NewValue": {
            "type": ["string", "number", "boolean", "null"],
            "description": "New value (for updates and inserts)"
          },
          "OldValues": {
            "type": "object",
            "description": "All old values (for deletes)",
            "additionalProperties": {
              "type": ["string", "number", "boolean", "null"]
            }
          },
          "NewValues": {
            "type": "object",
            "description": "All new values (for inserts)",
            "additionalProperties": {
              "type": ["string", "number", "boolean", "null"]
            }
          }
        },
        "required": ["Table", "PrimaryKey"]
      }
    }
  },
  "required": ["Operation", "Impact"]
}

and here are examples of operations:

update

{
  "Operation": "update",
  "Impact": [
    {
      "Table": "Instruments",
      "PrimaryKey": { "ProjectID": 4, "InstrumentID": 2 },
      "Column": "Name",
      "OldValue": "Old Instrument Name",
      "NewValue": "Updated Instrument Name"
    }
  ]
}

insert

{
  "Operation": "insert",
  "Impact": [
    {
      "Table": "Instruments",
      "PrimaryKey": { "ProjectID": 4, "InstrumentID": 10 },
      "NewValues": {
        "Name": "New Instrument",
        "Type": "Flexible Asset",
        "LastUpdated": "2024-10-05T12:34:56Z"
      }
    }
  ]
}

delete

{
  "Operation": "delete",
  "Impact": [
    {
      "Table": "Instruments",
      "PrimaryKey": { "ProjectID": 4, "InstrumentID": 5 },
      "OldValues": {
        "Name": "Obsolete Instrument",
        "Type": "Flexible Asset",
        "LastUpdated": "2024-10-01T09:15:00Z"
      }
    }
  ]
}
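If you do want to enforce the schema at runtime before applying an operation, it can be checked with Newtonsoft.Json.Schema (a separate NuGet package from Json.NET, with its own license). A minimal sketch, assuming the schema above has been saved to a hypothetical impact-schema.json file:

```csharp
using Newtonsoft.Json.Linq;
using Newtonsoft.Json.Schema; // separate NuGet package: Newtonsoft.Json.Schema
using System.Collections.Generic;
using System.IO;

public static class ImpactJsonValidator
{
    // Parse the schema once; it is reusable and thread-safe for validation
    private static readonly JSchema Schema =
        JSchema.Parse(File.ReadAllText("impact-schema.json"));

    // Returns true when the payload conforms; collects readable errors otherwise
    public static bool TryValidate(string json, out IList<string> errors)
    {
        var payload = JObject.Parse(json);
        return payload.IsValid(Schema, out errors);
    }
}
```

Calling TryValidate before ApplyChanges turns a malformed audit record into an error message instead of a half-applied transaction.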

Note: OpenAI's o1-preview was used to assist in the creation of the post.