How many programming languages do you know? How many shell commands do you know? How many markup formats do you know? How many frameworks do you know?
Is there a point where the sheer complexity of modern computing outweighs its benefits?
When I first got into computers in the 90s, things were much simpler. Most end-user programming was done in BASIC, maybe some Pascal or assembler, and that was about it. Shell commands? Well, the BASIC prompt effectively was the shell.
Hardware was simpler too. The Apple II's 6502 CPU had just 56 instructions. 56! If you were happy to learn…
I was first exposed to computer programming some time in the late 1980s or early 1990s. Initially it was probably 10 PRINT "Hello"; 20 GOTO 10 on a Commodore 64. I didn't actually have a computer of my own until late 1993, so I learned to program mostly from books borrowed from the library. I progressed to Pascal, which I enjoyed, and also learnt about microprocessors. I would write Pascal and assembly language programs on paper with no idea if they would compile or run.
In that era computers were still rudimentary. For example, my first computer was a Macintosh Classic II which had…
There has been some concern recently that there is no guarantee miners will store the data uploaded to the BSV blockchain, despite the large fees paid for it. nChain have made it clear that the ‘market’ will solve the problem. This effectively moves the solution for data storage off chain, and could mean that large data need not be stored on the chain at all. In this article I explore the implications of a ‘market’-based approach for the three Bitcoins.
There’s been some discussion and concern recently around BSV data permanence. One of BSV’s main features is the ability to store large amounts of data in transactions. The appeal is that once a transaction is paid for, it will be in the blockchain forever. However, it is becoming apparent that miners may not necessarily keep the data, as it isn’t required for them to mine blocks.
In this article I describe some of the perspectives surrounding this issue and propose a potentially simple change to the mining algorithm which could solve the problem.
V1 of Codeonchain provided a way to upload files to the metanet via a command line interface, with a GitHub-like web interface. V2 of Codeonchain is a complete rewrite of the website where all of the functionality of the command line tool has been moved to the web, plus additional features. At this stage the command line tool has not been updated for V2. This article covers the main features of Codeonchain V2 as well as some of the under-the-hood details.
Client-Server development is hard.
If we look at the evolution of software development from its beginnings to around the 1990s, it generally became easier, and end-users were increasingly able to develop advanced software. The pinnacle was development environments like Apple’s HyperCard and Microsoft’s Visual Basic.
This trend came to a screeching halt with the advent of the web. The web was originally a server-side architecture where all of the logic lived on the server, and the user experience was much poorer than that of client-side-only applications. Web 2.0 began to address this by moving more logic to the client. The result was…
Throughout history the curator has performed an important role in making the best content available for human consumption.
A newspaper or magazine editor will select the most suitable writers, decide on topics for editions, edit articles, and even write their own commentary. Editors have a highly esteemed role.
Growing up in the 80s and 90s, it was largely the radio DJ who determined what music we listened to. The radio was how we discovered good music. If the music wasn’t good, we switched off. …
When I was a teenager in the early 1990s I bought my first computer. It was a Macintosh Classic II which ran at 16MHz. A little over a year later I upgraded to a Mac with a 25MHz processor. About a year later I upgraded to a PowerMac 6100 running at 60MHz. Two years later I bought a PowerMac G3 running at 233MHz. These were all desktop machines. Two years later I was using a PowerBook G3 laptop running at 400MHz. Four years later I bought a PowerBook G4 laptop running at 667MHz.
Each of the above upgrades represented about a…
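As a rough gauge of that progression, the ratio of each machine's clock speed to its predecessor's can be tallied directly from the MHz figures quoted above (clock speed is of course only a crude proxy for real performance):

```python
# Clock speeds (MHz) of the successive Macs listed above
speeds = [16, 25, 60, 233, 400, 667]

# Ratio of each machine's clock speed to the previous machine's
ratios = [round(b / a, 2) for a, b in zip(speeds, speeds[1:])]
print(ratios)
```

Every step is at least a ~1.5× jump, with the PowerMac 6100 to G3 step closer to 4×.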
Sony and Philips introduced a new CD format called Super Audio CD (SACD) in 1999. It took a radically different approach to CDs (although it was backwards compatible via an optional CD layer). Despite very strong marketing efforts to bring SACDs to the mainstream, it never really took off and seemed to appeal mainly to audiophiles. Since the launch of SACDs we now have high-res PCM downloads as well as higher-resolution DSD downloads.
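For context on what "higher resolution" means here: DSD as used on SACD is a 1-bit stream sampled at 64× the CD rate (2.8224 MHz), while CD audio is 16-bit PCM at 44.1 kHz. Assuming those standard figures, the raw per-channel bit rates compare as follows:

```python
CD_SAMPLE_RATE_HZ = 44_100      # Red Book CD sampling rate
CD_BITS_PER_SAMPLE = 16         # CD PCM word length

DSD64_SAMPLE_RATE_HZ = 64 * CD_SAMPLE_RATE_HZ  # 2,822,400 Hz
DSD_BITS_PER_SAMPLE = 1                        # DSD is a 1-bit stream

cd_bps = CD_SAMPLE_RATE_HZ * CD_BITS_PER_SAMPLE        # bits/s per channel
dsd_bps = DSD64_SAMPLE_RATE_HZ * DSD_BITS_PER_SAMPLE   # bits/s per channel

print(f"CD:    {cd_bps:,} bit/s per channel")
print(f"DSD64: {dsd_bps:,} bit/s per channel ({dsd_bps / cd_bps:.0f}x)")
```

So base-rate DSD carries 4× the raw data rate of CD per channel, though the formats trade off bit depth against sample rate very differently, which is exactly where the arguments begin.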
In this article I want to look at some of the arguments for and against DSD as a format and some of the issues that…