I was first exposed to computer programming some time in the late 1980s or early 1990s. Initially it was probably 10 PRINT "Hello"; 20 GOTO 10 on a Commodore 64. I didn't actually have a computer of my own until late 1993, so I learned to program mostly from books borrowed from the library. I progressed to Pascal, which I enjoyed, and also learned about microprocessors. I would write Pascal and assembly language programs on paper with no idea whether they would compile or run.

In that era computers were still rudimentary. For example, my first computer was a Macintosh Classic II which had…

There has been some concern recently that there is no guarantee miners will store the data uploaded to the BSV blockchain, despite the large fees paid to them. nChain have made it clear that the 'market' will solve the problem. This effectively moves the solution for data storage off chain, and could mean that large data never needs to be stored on the chain at all. In this article I explore the implications of a 'market'-based approach for the three Bitcoins.

1. BSV

BSV is well known for its large data storage capability, and it is the primary use case of BSV at the…

There’s been some discussion and concern recently around BSV data permanence. One of BSV’s main features is the ability to store large data in transactions. The appeal is that once the transaction is paid for it will be in the blockchain forever. However, it is becoming apparent that miners may not necessarily keep the data as it isn’t required for them to mine blocks.

In this article I describe some of the issues and perspectives involved and propose a potentially simple change to the mining algorithm which could solve the problem.

  1. Non-permanence of the current data storage approach

V1 of Codeonchain provided a way to upload files to the metanet via a command line interface with a Github-like web interface. V2 of Codeonchain is a complete rewrite of the website where all of the functionality of the command line tool has been moved to the web, plus additional features. At this stage the command line tool has not been updated for V2. This article covers the main features of Codeonchain V2 as well as some of the under-the-hood details.


  • Create new repositories
  • Upload files
  • Create folders
  • Create links to existing files/folders/transactions
  • Download folder contents as a zip
  • Storage…

Recently I have been seeing the following banner popup when I read Medium stories:


I don’t use Medium seriously, I’m not a paying member, and I haven’t monetised any of my posts. However, I still get about 100 reads/week:

Client-Server development is hard.

If we look at the evolution of software development from its beginnings to around the 1990s, software development generally became easier, and end users were increasingly able to develop advanced software themselves. This culminated in development environments like Apple's HyperCard and Microsoft's Visual Basic.

Developer ease of use peaked in the late 1980s and early 1990s and decreased as client-server architectures became more common.

This trend came to a screeching halt with the advent of the web. The web was originally a server-side architecture in which all of the logic lived on the server, and the user experience was much poorer than that of client-side-only applications. Web 2.0 began to address this by moving more logic to the client. The result was…

Throughout history the curator has performed an important role in making the best content available for human consumption.

A newspaper or magazine editor will select the most suitable writers, decide on topics for editions, edit articles, and even write their own commentary. Editors have a highly esteemed role.

Growing up in the 80s and 90s, it was largely the radio DJ who determined what music we listened to. The radio was how we discovered good music; if the music wasn't good, we would switch off. …

When I was a teenager in the early 1990s I bought my first computer. It was a Macintosh Classic II, which ran at 16MHz. A little over a year later I upgraded to a Mac with a 25MHz processor. About a year after that I upgraded to a PowerMac 6100 running at 60MHz. Two years later I bought a PowerMac G3 running at 233MHz. These were all desktop machines. Two years later I was using a PowerBook G3 laptop running at 400MHz. Four years later I bought a PowerBook G4 laptop running at 667MHz.

Each of the above upgrades represented about a…
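Taking the clock speeds listed above at face value, the jump at each upgrade can be compared with a few lines of Python. (Clock speed alone is a crude proxy; architectural changes like the 68k-to-PowerPC transition mean the real-world gains differed.)

```python
# Clock speeds of the machines listed above, in upgrade order (MHz).
machines = [
    ("Macintosh Classic II", 16),
    ("25MHz Mac", 25),
    ("PowerMac 6100", 60),
    ("PowerMac G3", 233),
    ("PowerBook G3", 400),
    ("PowerBook G4", 667),
]

# Speedup of each machine relative to the one it replaced.
ratios = [new / old for (_, old), (_, new) in zip(machines, machines[1:])]

for (prev, _), (name, _), r in zip(machines, machines[1:], ratios):
    print(f"{prev} -> {name}: {r:.2f}x clock speed")
```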

Sony and Philips introduced a new CD format called Super Audio CD (SACD) in 1999. It took a radically different approach from CDs (although it was backwards compatible via an optional CD layer). Despite very strong marketing efforts to bring SACD to the mainstream, it never really took off and seemed to appeal mainly to audiophiles. Since the launch of SACD we now have high-res PCM downloads as well as higher-resolution DSD downloads.

In this article I want to look at some of the arguments for and against DSD as a format and some of the issues that…

You may not have heard of DSD (Direct Stream Digital). DSD is the encoding format for Super Audio CD (SACD), which Sony and Philips released in 1999. It had some success, but it launched just as music downloads were taking off, and SACD was a locked physical format which didn't support downloads.

In 2011 some engineers got together and defined the DSD over PCM (DoP) specification, which packs DSD data into standard PCM frames so that a computer can transmit DSD to a DAC. DAC manufacturers quickly supported it, and today (2016) many DACs support DSD. …
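The framing DoP uses is simple enough to sketch: each 24-bit PCM sample carries a marker byte (alternating 0x05 and 0xFA) in its top 8 bits and 16 DSD bits below it, so DSD64 rides inside a 176.4kHz/24-bit PCM stream. Here is a minimal sketch of that packing for a single channel; `pack_dop` is a hypothetical helper, not part of any library.

```python
# Sketch of DoP framing: pack a mono DSD byte stream (oldest bit first)
# into 24-bit PCM sample words. The alternating 0x05/0xFA marker bytes
# let the DAC recognise the stream as DSD rather than ordinary PCM.

DOP_MARKERS = (0x05, 0xFA)

def pack_dop(dsd_bytes):
    """Pack a mono DSD byte stream into a list of 24-bit DoP sample words."""
    if len(dsd_bytes) % 2:
        raise ValueError("DoP carries 16 DSD bits per PCM sample; need an even byte count")
    samples = []
    for i in range(0, len(dsd_bytes), 2):
        marker = DOP_MARKERS[(i // 2) % 2]
        # Marker in bits 23-16, two DSD bytes in bits 15-0.
        word = (marker << 16) | (dsd_bytes[i] << 8) | dsd_bytes[i + 1]
        samples.append(word)
    return samples

# Example: four DSD bytes become two 24-bit words with alternating markers.
words = pack_dop(bytes([0xAA, 0x55, 0xFF, 0x00]))
print([hex(w) for w in words])  # ['0x5aa55', '0xfaff00']
```

Because only 16 of every 24 bits carry audio, DoP needs roughly 50% more bandwidth than sending the raw DSD bits would, which is the price paid for compatibility with existing PCM transports.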

