couteausuis.se thoughts

a note about notes

intro

ever since i started using computers, my organisation has been a mess. i’m lucky enough that my memory is good and can compensate for my lack of organisation, but as i’m getting older, i’m trying to be mindful of leaving documentation traces, notes, etc. thus began my search for the perfect notes organisation tool.

requirements

to map my mental model, the tool should have everything i will ever need for the rest of my life: a one-stop shop for notes and other content. it should remove silos and lower friction like copy-pasting between multiple apps, remembering different keybindings, quirks and design trends.

conceptually, i’m more used to a hierarchical (or top-down) approach, but i really like the network (or bottom-up) approach. i guess the ideal is in the middle, where i would organise hierarchically (by project or type) while having an automatic keyword network.

feature-wise, the tool should offer no friction while being used simultaneously as :

  • quick notes or scratch pad
  • todos
  • daily stoic journal
  • organized notes
  • project management
  • publishable knowledge base (Notion)
  • programming notebook (jupyter)

blocks or notes should be freely moveable between usages, ex. a quick note can be moved into a project task, or into a notebook, etc.

continuing the thought process

if we push the reflexion a bit further:

  • most emails i receive lead to tasks or notes to take -> they want X information, i received Y information regarding a project.
  • calendar events mostly result from tasks or things not to forget -> meeting regarding project Z (agenda includes notes A,B,C and results in notes ZA, ZB, ZC).
  • “take a dentist appointment” is a calendar event that would be best suited as a note in the daily journal.
  • while programming, i often need to test snippets of code, or document it for a knowledge base.
  • code bugs often lead to tasks in a project.
  • code and research time is often billed or noted in the daily journal.
  • while managing a project, i often have to deal with other types of files -> those files could be linked to notes and stored appropriately.
  • paper files need organizing too and often lead to tasks or notes.

thus, it would make sense to replace the function of a file browser, calendar and mailbox, or embed them in some way (so as not to reinvent the wheel).

It would also make sense to be able to execute python code and sql queries directly in the notebook (via linked files and snippets), with code versioning and project scaffolding.

other considerations

the tool should abstract away the filesystem, be it local, remote, cloud, services or a mix of those, while not moving files or blocking other apps from using them (ex. VSCode for bigger projects). it should easily allow exports and migrations.

for example, one should be able to link a paperless-ngx instance to catalog and use files stored there in various notes.

the tool should be pretty but unobtrusive, allow customization via themes and plug-ins, and offer vim/emacs/native keybindings and a web browser extension.

most importantly, the tool should offer native clients (macOS/iOS), with a self-hosted open-source server.

contenders

in my lengthy search over the years, i tried multiple apps that achieve some of those requirements:

  • Notion (closed source and not self-hosted)
  • Outline (geared only towards KB/team use)
  • Logseq (too much of a network of notes)
  • Standard Notes (still too siloed)
  • AnyType (not self-hosted or customizable)
  • BookStack (geared only towards a wiki)
  • Joplin (too hierarchical)

but with limited time on this earth, i cannot settle for an incomplete tool to organize my whole life.

next steps

i know most of the requirements could be achieved with emacs and org-mode (even email and calendar).

i’ve used those tools before and unfortunately, the commands/actions take too much brain-space for me. i need the tool to be graphical first, with key combos that i can learn at my own pace.

stay tuned for Citrus Notebook…

migrate Apple Photos to Immich

intro

tired of having to pay for cloud storage (basically forever), i began the journey of migrating my Apple Photos Library to a self-hosted Immich server. here are the steps i took.

exporting from Photos.app

the first task is to download all the originals using the macOS Photos.app. Simply check ‘Download Originals to this Mac’ in the iCloud preferences. Let the app download every asset; you can watch the progress in your Library.

Then, we’ll be using osxphotos to do the actual export, as the Export feature in Photos.app is too restrictive. pro tip: do the initial upload over LAN, it will be much faster and you will not hit upload size restrictions. osxphotos has lots and lots of options and it can seem very daunting at first, but here’s a simple command to do the job:

osxphotos export yourFolder --skip-original-if-edited --sidecar XMP --touch-file --directory "{folder_album}" --download-missing

you can change the options to your liking, here’s the breakdown:

  • --skip-original-if-edited : will export the edited files only, and the originals if there’s no edited version.
  • --sidecar XMP : generates a corresponding XMP sidecar file for every asset, containing more metadata (faces, edits, etc).
  • --touch-file : retains the original file date instead of the export date.
  • --directory "{folder_album}" : creates folders based on the albums (they will be used to import to Immich after).
  • --download-missing : even tho we downloaded originals in the Photos app, some assets might still need downloading.
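
if i remember correctly, osxphotos also supports a --dry-run flag; it’s worth a pass to sanity-check the folder layout before committing to a multi-hour export (same options, but nothing is written to disk):

osxphotos export yourFolder --sidecar XMP --directory "{folder_album}" --dry-run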

Once the files are exported, we’ll upload them to the Immich instance.

upload to Immich

once you have your immich server running, bulk uploading is pretty straightforward. following the docs for the immich-cli tool, i’m using this command to upload the files:

immich upload --key YourAPIKey --server yourLANIP --recursive yourFolder --album

  • --recursive instructs the tool to upload the assets from every subfolder.
  • --album will create immich albums from the parent folder of the assets. don’t worry about duplicates, as immich does a good job of de-duping the files and the albums.

next steps

you can now delete all assets in your iCloud Photos Library and free up costly cloud space. the Immich ios app does a great job of syncing new assets to the server.

living on the bitcoin standard

intro

with bitcoin taking more and more space in my life with every new project, i began to think of living on the bitcoin standard, or making life changes to place bitcoin at the very center. i encourage you to try the same experiment and see how feasible it is in your current situation, as well as the different risks and benefits it can bring.

living on the bitcoin standard is like having a short position against the system, using your whole wealth and every transaction.

let’s analyse the idea by reversing the maslow pyramid:

bitcoin to store my wealth

i’m very lucky to have some wealth accumulated through the years. but how can i protect it from the inflation-theft and keep its purchasing power for future use?

Since i’m quite young, i can stomach pretty high volatility in the short term if i’m confident there’s a high probability of having a great return on a long timeframe.

bitcoin is the perfect fit for this, because of its deflationary issuance policy. no other financial product (in my mind) comes close to its colossal potential return, which has a very high probability given a long enough timeframe.

Following that chain of thought, i transferred all my RRSP savings into an RRSP trading account and bought BTCC.B shares. In other words, my retirement fund consists of shares of a bitcoin pool. It’s not my keys, it’s still an IOU-paper-bitcoin, but it allows me to accumulate wealth faster.

Parallel to this, i’ve been accumulating (lump-sum and DCA) bitcoin in cold storage since 2013. I think of my stack as a savings account.

Even with those life-changing moves, i still feel like i’m not doing enough to accumulate and propagate bitcoin.

Onto the next phase!

bitcoin as a pay stub

In the last 10 years, it’s gotten very easy to buy bitcoin:

  • Shakepay allows you to very easily set up recurring buys and dollar-cost-average.
  • BullBitcoin allows you to buy KYC-free at every Canada Post location, and offers other options with minimal KYC
  • Hardware wallets like Coldcard are getting easier to use

But still, DCA and lump-sum buying of bitcoin requires you to hold dollars in your account. Dollars that I don’t want.

So where do these dollars come from? My pay stub.

Let’s get paid in bitcoin directly (bi-weekly), and reverse the whole transfer process, selling only once a month, to pay expenses!

First off, the only real option here is with BullBitcoin.

Second, how much am i screwing up here? What is the risk profile for the past year? For the past 4 years? Basically, what is the biggest drawdown (in $CAD) I can expect over a 2-4 week bitcoin holding period? This graph shows the model I created, where I compare two ways of stacking bitcoin (plus their difference):

  • in Blue, buying bitcoin at the end of every month, using my budget leftover
  • in Red, having a bi-weekly pay converted to bitcoin and selling only what I need to cover the monthly expenses at the end of the month.
  • in Green, the difference at the end of the month between the two methods

without much surprise, the biggest drawdown happened in the bear market, where it cost me more (in bitcoin) to pay my expenses than what i bought for the same $CAD value. During the bull market, I only need to sell a smaller percentage of my bitcoin stack to cover expenses, thus netting a good profit.

Third, how much more bitcoin am I accumulating? This graph shows the same comparison as the previous one, but calculates the accumulated wealth in btc instead of in $CAD. Same conclusion: in the bear market it’s very costly, but during the bull phase I can accumulate faster.

The ideal scenario, it seems, would be to get paid in bitcoin during the bull run and revert back to monthly DCA during the bear market.

bitcoin to pay bills

Nice, now i’m getting paid in bitcoin! But wait, I still have bills to pay:

  • apartment/mortgage
  • insurance
  • electricity
  • internet
  • credit card

Let’s see if some of those utilities will take straight bitcoin.

Nope. I’ll work on that.

In the meantime, i’ll use Bylls from BullBitcoin to pay my bills using bitcoin. Now, all the leftover bitcoin from a month will go straight to my cold storage.

bitcoin as a currency

Let’s push the idea further and try to live by spending only bitcoin. Now this might seem counter-intuitive, since spending dollars and accumulating bitcoin is the perfect short position, but increasing bitcoin’s usage and merchants that accept it is a net positive.

I’ll call every merchant from the past months and ask if they accept bitcoin. If not, i’ll try to orange-pill them.

bitcoin farm architecture

intro

i’ve been involved in multiple bitcoin mining farms since 2013, currently operating a small, 150-machine farm. i find it’s a nice way to stay up to date with the latest bitcoin developments, as it involves hardware and software upgrades a few times per year. it’s also a cheap way to obtain bitcoin (the cost of production is less than the market price) and a nice place to test new ideas (heating, immersion mining, etc).

like when i started in 2013, i’m still offering my services as a consultant for all your bitcoin mining needs.

bitcoin mining

if you have no idea what bitcoin mining is, i invite you to read up on it.

bitcoin mining lets us acquire bitcoin at a much cheaper rate than the market price, in a way similar to the DCA method. you need a small upfront investment for the setup and the machines (you can buy used), and then you only have to pay for electricity monthly, while producing and stacking sats every day. of course, mining is extremely inefficient and creates a lot of (wasted) heat, but this is also a pretty nice by-product. using that heat, we can heat up homes during the winter or even heat up some greenhouses, drastically (~75%) reducing their energy costs. now if you stack your sats (instead of selling at market price) and wait ~4 years, you’ll even turn a profit on that heat enterprise! for further reading, look at the Arcane article.

since bitcoin mining is inefficient and uses a lot of electricity, farm operators are always looking at the price per kilowatt-hour ($/kWh) to help their margins. operators have moved across continents to save on energy prices. this also pushes the use of renewable energy sources, as they are often the cheapest option. bitcoin mining can also help offset the high capital expenditure needed to launch a renewable energy project. when you’re building a solar panel farm or a hydro dam, you’re building it for the expected energy needs in 10, 20, 30 years, so it will be oversized at the start. the few early customers would pay much more in the first years, and in developing countries it’s not even feasible, as customers can only afford the price once the project is fully utilized. by feeding the surplus energy to bitcoin mining farms, the whole project can turn a profit quickly and greatly reduce costs for the customers, making renewable energy projects cheaper and more accessible.

machines

we started the project in 2013, adding machines whenever we could. we are very lucky that our aging S9s are still profitable, but we do have to do a lot of maintenance on those aging machines. our friends at D-Central have helped us a lot in sourcing and repairing hashboards and PSUs. the current farm has quite a funky mix of machines:

  • 10 Whatsminers
  • 30 antminer X19
  • 40 antminer X17
  • 70 antminer S9

monitoring

my current farm is located quite remotely from where i live. cheap electricity and distance from neighbours were the driving factors in deciding its location. now since it’s far, i don’t want to have to go there often, which is why a lot of thought needs to be put into monitoring and remote access. a good remote monitoring solution should cover these points:

  • secure remote access
  • monitoring and configuring of individual machines
  • alert system
  • logs / dashboards
  • hassle free

secure remote access

i’m using a simple raspberrypi located on the farm site, running stock Raspbian OS. using tailscale as a VPN enables me to connect easily to the farm (much easier and faster than an OpenVPN or WireGuard tunnel) and since the mining site network is a specific LAN subnet (10.66.10.0/24), i can broadcast it from the raspberrypi (effectively turning it into a VPN router) and browse the whole farm and machines from the comfort of my home. tailscale also links every machine under my account, meaning that i can use my homelab (pomelo) or even a VPS to monitor and log the farm. to install tailscale with routing and broadcasting:

curl -fsSL https://tailscale.com/install.sh | sh
echo 'net.ipv4.ip_forward = 1' | sudo tee -a /etc/sysctl.conf
echo 'net.ipv6.conf.all.forwarding = 1' | sudo tee -a /etc/sysctl.conf
sudo sysctl -p /etc/sysctl.conf
sudo tailscale up --advertise-routes=10.66.10.0/24
#activate route in tailscale web admin
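
one gotcha: linux clients (like pomelo or a VPS) won’t use the advertised subnet until they opt in, while macOS and iOS clients pick up the routes automatically once approved. on the linux side:

sudo tailscale up --accept-routes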

pro tip: since i’m using tailscale MagicDNS to use my pihole dns from every device, all dns queries from the raspberrypi would go through my pihole. because i’ll use the raspberrypi as a stratum mining proxy, i don’t want to add that layer of latency and/or another point of failure. adding this line at the end of /etc/dhcpcd.conf will force it to use cloudflare’s dns server : static domain_name_servers=1.1.1.1

monitoring and configuring of individual machines

on every machine, i’m installing Braiins OS+ and the next logical step is to use Braiins’ own FarmProxy to reduce bandwidth usage and latency. with FarmProxy, we can aggregate hashpower and split it between partners/customers without having to reconfigure single machines. using Braiins OS+ also enables us to monitor the individual machines (power, temperature, hashboards) since they each have prometheus endpoints. Braiins OS+ can also be configured using btctools, so we can push configurations in batch and easily scan the network when adding new machines or replacing old ones. to install FarmProxy on the raspberrypi:

#install docker & docker-compose
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh
sudo usermod -aG docker ${USER}
sudo apt-get install -y libffi-dev libssl-dev python3 python3-dev python3-pip
sudo pip3 install docker-compose
sudo systemctl enable docker
#reboot

#install FarmProxy
sudo apt update
sudo apt install -y git
git clone https://github.com/braiins/farm-proxy.git
cd farm-proxy
docker-compose up -d

pro tip: before starting FarmProxy, i removed the prometheus and grafana docker services, since i’ll be launching those from my homelab and i want to keep the raspberrypi lightweight.

logs, dashboards and alert system

on my homelab (pomelo), i’ve installed btctools, bos-farm-monitor and bos-toolbox:

#install snap and btctools
sudo snap install core btctools

#install FarmProxy monitoring only
git clone https://github.com/braiins/bos-farm-monitor.git
cd bos-farm-monitor
#change the list of IP address ranges to scan in ./scan_crontab
docker-compose up -d
#dashboards at localhost:3000

i also made a bunch of scripts to do various batch operations on the machines. i’m using this setup to update the firmware, under/overclock, change pools, locate faulty machines, etc. first, create a file with each machine’s IP address (one per line), then create a script similar to these (ex. updating T17 machines to the latest Braiins OS+):

#updating T17 machines to the latest Braiins OS+ firmware
for IP_ADDRESS in $(cat "ListOfUpdate.csv"); do
    ssh -o HostKeyAlgorithms=+ssh-rsa \
        -o PubkeyAcceptedKeyTypes=+ssh-rsa \
        root@$IP_ADDRESS \
        'wget -O /tmp/firmware.tar firmware_url && sysupgrade /tmp/firmware.tar'
done

#locate a machine by making it flash its LED lights
for IP_ADDRESS in $(cat "ListOfFailed.csv"); do
    ssh root@$IP_ADDRESS 'miner fault_light on'
done

archive: Asteroid Labs

Following Bitcoin and cryptocurrencies since 2013, when I helped kickstart the Bitcoin Embassy in Montréal, I am now building a revolutionary project in this field. Born from the lack of adequate tools and scattered services, we are porting financial tools to cryptos, bringing multiple projects under one banner with a single login and drastically reduced fees. Read more on asteroidlabs.io.

See also : Meteor Trader : automatic trading using a custom-baked indicator

bitcoin and the future

what is bitcoin

bitcoin is a form of digital currency where every transaction, once confirmed in the blockchain, is irreversible. Its ease of use, high security, merchant and platform support, and high market capitalization make it the most popular asset to transact digitally. It is not anonymous by any means; however, the identity of the user behind an address can be unknown.

There is a fixed supply of 21M bitcoins, and countless coins were lost during the early days of the currency. Coins are issued to miners for every block that gets confirmed on the blockchain. The issuance policy follows a rule dictating that roughly every 4 years, the number of bitcoins issued to miners is cut in half (the halving).

while bitcoin is a software program, all the actors have incentives to stick to the same working code, increasing confidence in the network. Code changes need to be approved by different crowds, each with different usages & incentives (developers, node operators, pools, miners, users), making it very difficult to tilt the power toward one group.

digital scarcity / Stock-to-Flow

more and more merchants are accepting bitcoin every day, but most of its volume is based on price speculation, as its true potential makes it critically undervalued.

“As a thought experiment, imagine there was a base metal as scarce as gold but with the following properties: boring grey in colour, not a good conductor of electricity, not particularly strong [..], not useful for any practical or ornamental purpose .. and one special, magical property: can be transported over a communication channel” — Nakamoto [2]

This magical property has tremendous value, and multiple models are trying to calculate bitcoin’s true price. The S2F/S2FX model is the one that makes most sense to me.

Nick Szabo has an interesting take on ‘scarcity’ and defines it as ‘unforgeable costliness’: since it costs so much to create a real one, its creation cannot be easily faked. This applies neatly to bitcoin: as it costs a lot of electricity to produce new bitcoin (PoW), producing bitcoin can’t be easily faked.

In terms of stock-to-flow ratios, often used for scarce metals and consumable commodities, scarcity can be explained as follows:

“For any consumable commodity [..] doubling of output will dwarf any existing stockpiles, bringing the price crashing down and hurting the holders. For gold, a price spike that causes a doubling of annual production will be insignificant, increasing stockpiles by 3% rather than 1.5%.”

“It is this consistently low rate of supply of gold that is the fundamental reason it has maintained its monetary role throughout human history.”

“The high stock-to-flow ratio of gold makes it the commodity with the lowest price elasticity of supply.”

“The existing stockpiles of Bitcoin in 2017 were around 25 times larger than the new coins produced in 2017. This is still less than half of the ratio for gold, but around the year 2022, Bitcoin’s stock-to-flow ratio will overtake that of gold” — Ammous[5]

In the stock-to-flow model, stock represents the size of the existing reserves, whereas flow is the yearly production. Gold has an SF of 62, meaning that it takes 62 years of production to get as much gold as the current gold reserves.
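
with the rough, round numbers usually cited in the S2F literature:

SF = stock / flow
gold : ~185,000 tonnes mined / ~3,000 tonnes produced per year ≈ 62
bitcoin (2017) : ~17.5M btc / ~0.7M btc issued per year ≈ 25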

Since scarcity, and mainly the stock-to-flow value, has been shown to be directly linked to price movements, we can safely state that a rise in scarcity will result in a rise in price.

Furthermore, since bitcoin follows the same scarcity law as precious metals and its scarcity jumps at every halving, its price should rise tremendously about every 4 years (with a few months of lag).

Follow the stock-to-flow model applied to bitcoin price movements here.

institutions buy-in

in the 11 years since its inception, a lot of big players have watched from the sidelines to better understand how bitcoin behaves. Now, every week, big institutions (Grayscale, MicroStrategy, Fidelity), VC funds and billionaires (Paul Tudor Jones) are buying huge quantities of bitcoin. Cumulatively, they buy much more than the daily production.

more information here

This is not yet priced in. As Hemingway wrote: “gradually, then suddenly”.

the inevitable future

What the multiple fiat market crashes have proven to us is that the current monetary supply and financial system is deeply flawed and not fixable.

Since 2008, most governments have adopted a policy of infinite monetary supply, or Quantitative Easing, leading to endless inflation. 20% of all US dollars were printed in 2020 alone, which directly translates to every US dollar losing 20% of its value.

All around the world, debts are growing at never-before-seen rates, while governments apply the same monetary policies and have no intention of reducing them.

If you look at the purchasing power of the US dollar since 1971, 1000$ in 2020 buys only what 150$ bought back then.

With bitcoin and its inherent scarcity, 1 bitcoin will always be equal to 1 bitcoin in 50 years.

Bitcoin is a hedge against the diminishing value of fiat currencies and a much better store of value for your wealth in this uncertain future.

the cost of a sell

Selling an asset early will appear like a loss in the future.

Let’s look at our friend Steve, who’s in this situation:

  • Bought 1 bitcoin at 10K$
  • bitcoin is now trading at 20K$ (2x)
  • He firmly believes that a bitcoin can reach 200K$ (20x) in the next few years

If Steve sells 0.25btc now, he’ll pocket 5K$, a 2.5K$ profit (2x ROI). In a few years, once a bitcoin is worth 200K$, the 0.25btc he sold would have been worth 50K$. Selling early thus costs him 45K$ (the 50K$ he would have had, minus the 5K$ he took).
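
spelled out:

sell now : 0.25 btc × 20K$ = 5K$ in hand (2.5K$ cost basis -> 2.5K$ profit)
hold : 0.25 btc × 200K$ = 50K$
cost of the early sell : 50K$ - 5K$ = 45K$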

counter arguments

bitcoin has a few counter-arguments, and i’ll try to explain the most valid of them here.

bitcoin has no intrinsic value

While it’s true that it’s hard to calculate its intrinsic value, that does not mean there is none. bitcoin is a completely new asset/tool/currency, etc., and humanity is only beginning to uncover its potential. Almost all paradigm shifts had no obvious intrinsic value at launch: the car, the internet, social networks.

bitcoin is unstable and based on speculation

In its first few years, the bitcoin market was illiquid, resulting in a lot of price fluctuations. The fact that not a lot of people were trading it meant that the price could move by a lot in a few minutes. Nowadays, we have big institutions, OTC trading and billions in liquidity. The fact is, the price has been pretty stable in the past 2 years, and transaction times were never affected.

bitcoin uses (waste) too much energy

As I explained earlier, each bitcoin is a proof of work done. That work needed a tremendous amount of energy to be executed. It is also the key concept of bitcoin, its value proposition and what makes it irreversible. In other words, this can’t be fixed. However, technological improvements can lead to improved miner efficiency. Furthermore, since miners are paying for this electricity, they are in search of the cheapest electricity source, which happens to be solar, the cleanest of them all. Even now, with a volume and transaction value per second never seen before, the whole bitcoin network uses less energy than the banking system it will replace.

macOS wallpapers

when i started using a computer, with Mac OS X 10.2 “Jaguar”, my main focus was to gather loads and loads of desktop pictures that meant something to me, or that were awe-inspiring.

with every major upgrade of my system, new colorful and fascinating pictures were added to my collection.

lately, i’ve been delving into the nostalgia of re-exploring the wallpapers of my childhood, the ones that came with every new version of Mac OS X.

this was fueled by a similar endeavor by 512pixels, who upscaled some pictures to 5K resolution.

while I really liked the idea, i’m not a fan of upscaling and prefer proper archiving.

i began to gather every Mac OS X installer, from 10.0 Cheetah to 11 Big Sur, and extracted the default desktop pictures folder from each, which you can download here :

- since 10.7 Lion, there is no longer a separate “server” installer, only an app that provides the same functionality.

then, i compiled every image (removing duplicates) to create a giant folder of every picture ever used as a wallpaper by a default installation of Mac OS X : Download the complete combined set

- the Promo Shots were omitted

pomelo

Build

When one of my previous workplaces closed, I was gifted a complete server setup inside a blank ATX-sized tower.

For a few years it was used as a cryptocurrency mining rig, but since it was not profitable anymore, I began to think of how to repurpose it.

At the same time, I was very frustrated by multiple things in my life :

  • my files were scattered between 4-5 different external HDDs. Keeping them in sync was painful.
  • Using Backblaze as a cloud backup meant I had to keep those HDDs plugged in often.
  • Time Machine backups on a single HDD plugged into an AirPort Extreme were pretty slow with our 5 macs.
  • Kodi is very ugly and a pain to use.
  • my raspberrypi was getting overwhelmed with docker containers.
  • I wanted to take back control of my data

While this was on the back of my mind, the pandemic hit, and I suddenly had much more free time to start working on the obvious project of building a complete home server.

In terms of raw performance, the current setup is as follows :

  • 6-core (12-thread) Intel i7-5820K @ 3.30GHz
  • 8 x 8 GB DDR4 RAM
  • 256 GB M.2 SSD
  • 9 SATA ports for 3.5” disks
  • Ubuntu Desktop 20.04

Being on a tight budget, I took the time to plan my transition: I would prune my current backups, juggle data between disks as I moved each disk into the server, then move the data back onto it. The whole process took me about 2 weeks. I have upgraded the disks in all the data pods since, so I essentially wasted those 2 weeks…

To prevent bit rot and further secure my data, I decided to use ZFS for my data pods :

  • 3 x 6TB in raidz1, 12TB available, for critical data, photo & video backups, time machine, etc.
  • 2 x 8TB in mirror, 8TB available, for tv series.
  • 2 x 4TB in mirror, 4TB available, for the bitcoin node, ps3 isos, torrents and as an all-around scratch disk
  • 2 x 12TB in mirror, 12TB available, for films and other related files.

Total of 36TB available, currently 75% full (I’m a real data hoarder).

Building the zfs arrays was quite straightforward and simple. It’s also very easy to health-check the pods (zfswatcher), scrub them or replace disks.
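
for reference, a mirrored pod boils down to a handful of commands (the pool name and disk ids below are placeholders, not my actual ones):

# create a 2-disk mirror
sudo zpool create films mirror /dev/disk/by-id/ata-disk1 /dev/disk/by-id/ata-disk2
# health-check all pools
sudo zpool status
# verify checksums and repair bit rot
sudo zpool scrub films
# swap a failing disk for a new one
sudo zpool replace films ata-disk1 ata-disk3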

Since I used Ubuntu, I could not use Backblaze anymore, and decided to switch to Crashplan. It’s as easy to use, but much, much slower to upload. Even with multiple tricks to speed up the process, I managed to back up only 25% of my files in 7 months.

Services

Quite a few services are always running on my server; I’ll run through them in detail here.

CrashPlan (Code42 GUI app) is always running with a high priority, backing up every new file.

To use the server as a NAS, accessible from our 5 macs, I’m using netatalk to create a few afp share points. I can even make one that’s recognised as a Time Machine destination. See instructions here.
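
the gist of the config, as a minimal sketch (netatalk 3 syntax; the paths are placeholders for my zfs datasets):

[Global]
  mimic model = TimeCapsule6,106

[shares]
  path = /tank/shares

[Time Machine]
  path = /tank/timemachine
  time machine = yes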

A pihole container is running to block all ads on our LAN, using DNS-over-HTTPS. I use a docker-compose inspired by this post.

A complete bitcoin core node is always running too. Still in GUI mode.

Lately, I’ve been giving my wasted cpu cycles to BOINC, in order to help scientific research.

I had a few trading projects running, but they are down now since I paused my work on them.

Plex

There is a complete Plex system running with docker-compose :
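
the compose file itself is nothing special; a minimal sketch using the linuxserver/plex image would look like this (the volume paths are placeholders, not my real datasets):

version: "2.1"
services:
  plex:
    image: linuxserver/plex
    container_name: plex
    network_mode: host
    environment:
      - PUID=1000
      - PGID=1000
      - VERSION=docker
    volumes:
      - /path/to/films:/movies
      - /path/to/tv:/tv
      - /path/to/config:/config
    restart: unless-stopped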

There is also a deluge-with-vpn (ProtonVPN) container, to keep my torrent traffic separate.

version: "2.1"
services:
  deluge-openvpn:
    image: binhex/arch-delugevpn
    container_name: deluge-openvpn
    cap_add: 
      - NET_ADMIN
    environment:
      - PUID=1000
      - PGID=1000
      - UMASK=002 #optional
      - VIRTUAL_HOST=deluge.couteausuis.se
      - LETSENCRYPT_HOST=deluge.couteausuis.se
      - VIRTUAL_PORT=8112
      - VPN_ENABLED=yes
      - VPN_USER=myUser
      - VPN_PASS=myPass
      - VPN_PROV=custom
      - VPN_CLIENT=openvpn
      - VPN_OPTIONS=
      - STRICT_PORT_FORWARD=yes
      - ENABLE_PRIVOXY=no
      - LAN_NETWORK=10.0.1.0/24
      - NAME_SERVERS=209.222.18.222,84.200.69.80,37.235.1.174,1.1.1.1,209.222.18.218,37.235.1.177,84.200.70.40,1.0.0.1
      #- DELUGE_DAEMON_LOG_LEVEL=info
      #- DELUGE_WEB_LOG_LEVEL=info
    networks:
      - net
    ports:
      - 8112:8112
      - 58846:58846
      - 58946:58946
    volumes:
      - /etc/localtime:/etc/localtime:ro
    restart: unless-stopped
networks:
    net:
        external: true

All of those services are running behind a reverse proxy, so I can access their home pages using my domain with https (as I use CloudFlare to manage my domain, I have access to a free ssl certificate):

version: "3.7"
services:
  reverse-proxy:
    image: "jwilder/nginx-proxy:latest"
    container_name: "reverse-proxy"
    volumes:
      - "html:/usr/share/nginx/html"
      - "dhparam:/etc/nginx/dhparam"
      - "vhost:/etc/nginx/vhost.d"
      - "certs:/etc/nginx/certs"
      - "/run/docker.sock:/tmp/docker.sock:ro"
    restart: "always"
    networks: 
      - "net"
    ports:
      - "80:80"
      - "443:443"
  letsencrypt:
    image: "jrcs/letsencrypt-nginx-proxy-companion:latest"
    container_name: "letsencrypt-helper"
    volumes:
      - "html:/usr/share/nginx/html"
      - "dhparam:/etc/nginx/dhparam"
      - "vhost:/etc/nginx/vhost.d"
      - "certs:/etc/nginx/certs"
      - "/run/docker.sock:/var/run/docker.sock:ro"
    environment:
      NGINX_PROXY_CONTAINER: "reverse-proxy"
      DEFAULT_EMAIL: "[email protected]"
    restart: "always"
    depends_on:
      - "reverse-proxy"
    networks: 
      - "net"
volumes:
  certs:
  html:
  vhost:
  dhparam:

networks:
  net:
    external: true

Other services:

  • Tor relay
  • I2P node
  • zfswatcher
  • wireguard

Finally, I monitor my server with a mix of Cockpit and NetData.

Future projects :

  • Take back control by self-hosting more services/alternatives (BTCPay server, DomainMOD, etc)
  • Add a UPS (that’s quite urgent)
  • Switch to CLI for my last GUI apps, and to Ubuntu Server.
  • Minecraft server
  • Build a dead man’s switch for password and information release.

archive: Beyond Innovations

Being highly sensitive to climate change, sustainable development and renewable energy, a friend and I started to build recycled and recyclable domestic-sized wind turbines. We are currently building weather stations to gather data and help finance the project. Subscribe to stay tuned at beyondinnovations.tech.

archive: lejacobroy.com

in 2011, when i was only 15 years old, i decided to build myself a website to showcase my photographs. while working on it, i began adding modules and expanding features:

  • handmade front-end design
  • front & back ends built from scratch, in PHP5 and MySQL
  • when uploading a photograph, the back-end extracts its metadata and stores it in the DB
  • fully-featured shopping cart to buy pictures through PayPal (PP integration no longer working)
  • blogging section using WordPress (no longer functional)
  • showcase section with other art projects
  • users would buy pictures and receive an encrypted link by email, allowing them to download a server-generated zip file containing the picture and license.

it then became clear that programming was a bigger passion than photography! a year later, when i backed up the site, i forgot to choose a latin encoding, resulting in weird glyphs on the restored website

please, don’t judge

lejacobroy.com (moved to http://lejacobroy.000webhostapp.com/)

bpg dump

In the past few weeks, i stumbled upon a vast trove of images that needed to be sorted and filed. Here’s the problem : to save bandwidth, a genius converted all those images into the BPG format.

bpg problem(s)

First off, the format & thinking behind the BPG project is fantastic. Furthermore, Fabrice Bellard is the perfect man to work on a library like that. All around the web, you can find promising results : savings of up to 80% while retaining the same quality as a JPEG at Q80.

Now, the first problem with a new (better) format is that from now on, it adds another player to the Monopoly game (relevant XKCD). Second, even after more than 9 months have passed since its release, the bpg format is not supported by any “big” image viewer. To give you an idea, here’s the list of apps that support it :

solution one

I reluctantly decided to convert the images back to jpg. There are more than 45k files… In my search for a (free) converter that would directly support BPG -> JPG, I found Romeolight BPGconv. Here’s the tricky part : I need a windows VM to run it, with enough space to hold both the converted and unconverted files. I set up a shared network folder to hold the bpg files (30GB) and another one with about 80GB free to hold the jpg files. I then loaded all the files into the batch decoder and let it run the whole night.

RomeoLight BPGconv suffers from a big bug where it copies every bpg file it processes into the AppData/Temp folder but doesn’t remove it afterwards. This bug fills the disk and stops the whole VM after only 1h.

To circumvent that, I wrote a simple BAT script that deletes the Temp folder, and used Task Scheduler to run it every 30 minutes. It worked, and was surprisingly stable. After 12hrs of the frankenstein computer heating my room, I extrapolated the numbers and calculated that I needed 4.5 more days for it to churn through all the images. That’s a big no.

solution two

Luckily for me, libbpg is available on Homebrew and it comes with a bpg decoder, bpgdec. You can use it to convert BPG files to PNG (a very long process) or to PPM, which is faster. What the heck is PPM? At this point I feel like I’ve fallen down a rabbit hole. PPM stands for Portable PixMap and is really just 24-bit image data in a text file… very helpful. From a 350KB BPG, we get a 12MB PPM file.

To convert my fresh PPM file into something useable (that was the goal of the post), I need another useful tool, imagemagick.

I wrote a small bash script to loop through all the bpg files in a folder and convert them to jpg.

#!/bin/bash
# convert every bpg file in the current folder to jpg, via ppm
dir=$(pwd)
mkdir -p out
for fullfilename in "$dir"/*.bpg; do
    [ -e "$fullfilename" ] || continue
    filename=$(basename -- "$fullfilename")
    filename="${filename%.*}"
    echo "Converting image # ${filename}"
    bpgdec -o "$filename.ppm" "$fullfilename"
    convert -quality 75 "$filename.ppm" "out/$filename.jpg"
    rm "$filename.ppm"
done

final thoughts

This bash script is not very parallel-friendly (are any bash scripts?), so I split my stack of 45k images into 8 folders and started 8 copies of the script separately (4 cores + hyperthreading). In 8hrs everything was done, and I was very happy. I guess the moral of this post is to always try to do it yourself: you’ll learn much more along the way and probably find shortcuts.
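
for what it’s worth, the folder split isn’t strictly needed; xargs can fan the same loop body out across cores (a sketch, assuming bpgdec and imagemagick are on the PATH and an out/ folder exists):

ls *.bpg | xargs -P 8 -I{} sh -c 'b="${1%.bpg}"; bpgdec -o "$b.ppm" "$1" && convert -quality 75 "$b.ppm" "out/$b.jpg"; rm -f "$b.ppm"' _ {}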

myths on electric cars

When we explore the idea of an electric car and what it implies (mostly breaking ties with the old way of doing things), we quickly stumble upon a vast group of people who devote their lives to discrediting the whole concept through doubt and fear.

Here are the rough ideas with which they’ll try to convince you that electric cars are a conspiracy created by a pot-smoking alien.

electric cars are polluting as much as an ICEV

In some states/countries, electricity is made in big power stations powered by oil, gas or nuclear fuel. In those places, charging an electric car seems to only divert the fuel consumption to the power plants. This is mostly correct, except that using an electric car instead of an ICEV will still result in a much lower carbon footprint (or tiretrack…) and much lower fuel consumption.

Here’s the catch: because the combustion engine needs to fit inside an engine bay, be light enough so as not to affect the car too much and be quiet enough not to disturb the car’s occupants, it needs to compromise on something else. A car’s combustion engine compromises on efficiency. A modern one will probably reach a measly 25% efficiency, meaning that for every drop of fuel, only 25% of it is converted into actual work. Even worse, when idling, the efficiency drops close to 0%, since no work is being done (except heating and a small electrical charge).

On the other hand, power stations don’t have those kinds of limitations: they can be as big and as heavy as needed, and run as long as they want. In fact, their only constraint is to be as efficient as possible, because their profitability depends on it. A modern thermal power station has an efficiency of about 60%, which makes power stations about 2.4 times as efficient as the combustion engines in cars. Let’s see how that translates into real-world usage : if you can travel 100 km with 10 L of petrol (10 L / 100 km) in an ICEV, the same 10 L of petrol burned in a power plant could generate enough electricity to drive about 240 km.

Of course this calculation doesn’t take into account power transmission losses, battery efficiency, or the monetary & energy costs of refining and transporting petrol to the petrol stations. Even if we factor those in, the trend is clear : it is at least 2 times more efficient (and cleaner) to drive an electric car instead of an ICEV, even when the electricity you use to charge it comes from a dirty source. Imagine when your electricity comes from solar, wind or hydro!

batteries are not recyclable and are a big pollution source

Because of all the weird-sounding names like Lithium and Ion inside the battery, people think that it’s highly polluting (during its production or afterlife) and that we can’t even recycle it.

In reality, it is true that producing a car battery involves lots of polluting processes : extracting lithium, nickel and cobalt is very hazardous and can contaminate water wells, among other things. On the other hand, batteries can be recycled, their chemistry can be restored, or their cells can be repurposed for stationary use (like Nissan does).

charging could overload the grid

Let’s consider Canada’s case : we have around 33 million cars on the road, averaging 20 000 km per year. Knowing that an electric car consumes about 20 kWh per 100 km, if every car in Canada were electric, the total needed electricity would amount to about 132 billion kWh yearly (around 3 600 kWh per capita). Adding this to the 15.5 MWh total electricity consumption per capita brings it to 19.1 MWh yearly per capita. Changing all cars in Canada to electric would add only 23% to our total electricity consumption. Furthermore, because most of the charging occurs during off-peak hours, we could use up the surplus we currently have, and would not need to add as much infrastructure.
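
the back-of-the-envelope math, using the same numbers:

33,000,000 cars × 20,000 km/yr × 0.2 kWh/km = 132 billion kWh/yr
132 billion kWh ÷ ~37 million people ≈ 3,600 kWh (3.6 MWh) per capita
3.6 MWh ÷ 15.5 MWh ≈ 23% increase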

Now the killer argument is this : extracting and refining crude oil into petrol takes a lot of energy, more than 5 kWh per litre of fuel. For an ICEV to travel 100 km (at 10 L / 100 km), more than 50 kWh will have been used up, whereas an electric car will use around 20 kWh: another 2.5x efficiency gain. Knowing this, if every car in Canada were electric, it would actually LOWER our total electricity consumption (because we extract and refine a lot of petrol).

anxiety

Detractors believe that electric cars are not really usable because of the long time it takes to charge them and the small range they provide.

While it’s true that charging on a Level 1 charger can take a very long time (8hrs+), most of the charging happens on Level 2 or Level 3, with charging times of around 5hrs or 30-45mins respectively. Next, consider that for most of the day, the car is parked and unused, be it at your house, at work, in a parking lot, etc. We can easily charge the car whenever it’s not in use, even if it’s already at 80% charge, because plugging in the car takes only a few seconds, compared to filling up an ICEV tank.

Range anxiety depends greatly on the person’s way of life : most Canadians travel less than 50 km per day, meaning they would need to charge once every 10 days with a Tesla Model 3, and once every 4 days with a Nissan Leaf. Not bad at all! And do not forget that we do not need to go to special charging stations, because electricity is everywhere.

final word

It’s true that when both cars roll off the production line, the electric car is dirtier because of all the rare metals inside the battery and motor. But as soon as the car has travelled 30 000 km, the total emissions are equal. After that, the electric car is greener than the ICEV.

email non-revolution

When we look back at the last 20 years, can we find one tool that we use every day that hasn’t been “revolutionized”? Cars -> Electric cars, Appliances -> Smart Appliances, Phones -> Smartphones (can we still call those phones?), Lights -> LED, Wires&Cables -> Wireless, Computers that can display text -> Computers that can simulate the whole universe in VR (not really), Money -> Cryptocurrencies, Houses -> Tiny houses, WEB -> WEB2.0 (Graphical) -> WEB3.0 (Apps), Email -> Email. Yup, despite all the changes elsewhere, email hasn’t changed.

i beg to differ

Well, of course, it has changed; there was an evolution, a few small incremental updates. Email clients are essentially web browsers, and emails, nowadays, are essentially static webpages. Because of this, emails have gotten prettier with time, following the advances of web design. Email clients have gotten smarter, with SPAM filtering, smart mailboxes, contacts, etc. In fact, for the majority of people, their web browser is their email client. That’s right, webmail. It kinda feels full-circle-y.

except, not

At its core, modern email is still the same old email. First, let’s consider the UI : most, if not all, email clients display mailboxes in a left sidebar, beside a top-to-bottom list of emails showing the title and the sender’s address. When you write an email, you need more information than in other messaging systems :

  • The recipient’s address. Some clients have contacts baked in, some let you import/export contact lists, and some simply don’t have any contact system.
  • A Subject. This actually makes me angry. A Subject needs to be short and set the tone for the email’s content. Also, it can’t be too vague. It really doesn’t need your name or the date, because the recipient already knows that information from metadata. And it cannot be empty. I’m always deeply puzzled as to what to put in a title. Emails should be self-explanatory. Final rant : why put it on top, as one of the first things to fill? When you write a letter, you normally decide on the title at the end, after all your thoughts have been laid out.
  • The content. Writing an email is still mostly text. Of course you can send links, images, even short videos, and on some clients you can go full Microsoft-Word-Clipart-style on it. Even with all that, it’s still missing a dynamic WYSIWYG editor. Also, a 10MB size limit? smartphone pictures are bigger than that.

why would i write an email?

Writing an email is mostly like sending physical mail. You write something, send it, and you never know when the recipient is going to read it. On most social networks, you can leave messages for others, messages that can be read in the future. BUT, email isn’t a social network: you don’t NEED to be trapped inside a network for it to work. The only thing you need is an address, like a webpage. And it works from everywhere, with everybody.

peaked or left behind

Will email get better? I think not. For the moment, it is perfect for what we do with it and will only get displaced when a new paradigm shift occurs. Yes, its usage has declined as social networks prevail, but we still can’t imagine a world without it.

Meteor Trader : automatic trading using a custom-baked indicator

TLDR; see the gains here : Asteroid Labs Dashboard

Early on in our trading-chart analysis, we devised a simple yet effective indicator that wasn’t available anywhere. After successfully programming it and testing it manually, it was time to hit the high road and automate the whole thing.

This project is going to be built using NodeJS and Docker, for portability between my MacBook, RaspberryPi and Vultr instance.

goals

Basically, I want the program to gather data about markets and store it in a database, to keep a history. On demand, it will calculate the pairs’ Meteor Indicator from the (local) historic data, and send it to me using Telegram. Eventually, if it behaves well, I want the trading to be automated (which includes buy & sell orders, price & quantity calculations, further market analysis, and probably losing money).

warmup

As with all exercises, we need some warmup first. In this case, we will query the last 20 candles from Binance, with which we can calculate the Meteor Indicator. Using zoeyg’s binance module, I can easily query market data from Binance using :

// fetch the last 100 candles for a given symbol & interval
binanceRest.klines({
    symbol: symbol,
    interval: interval,
    limit: '100'
})
.then((data) => {
    console.log(data); // array of klines (open/close/high/low/volume, etc.)
})
.catch((err) => {
    console.error(err);
});

The resulting data contains a bunch of information on the specific candle, and we’ll store it in a MongoDB instance using the appropriate Schema. Now, because I want to query a list of pairs, I wrap it up in a function and call it inside a for loop. For the sake of your hair, watch out for NodeJS asynchronicity. The process will not exit if mongoose still has an idle connection open, so we need to manage that. I used a decreasing counter in the mongo.save() loop, so that when there’s no more data to be saved, it kills the connection.

feed

Binance has a very robust API and offers a WebSocket stream that we can connect to and receive pushed data. Using the same module, we can easily subscribe to a ticker’s candle stream :

binanceWS.onKline(symbol, interval, (data) => {
    if (data.kline.final == true) { // save only full candles
        // note: don't shadow the outer 'symbol' argument here
        var candle = new Pair({eventType: data.eventType, eventTime: data.eventTime, symbol: data.symbol, interval: interval, currentClose: data.kline.close});
        candle.save(function (err) {
            if (err) {
                console.log(err);
            } else {
                console.log('Added ' + data.symbol + ' with timestamp ' + data.eventTime + ' interval of ' + interval + ' to DB');
            }
        });
    } // else: skip the pushed (partial) update
});

Now, Binance’s stream will push updates every second, but I only want to save a candle once it has closed, so I test whether the kline is final.

bot

During the programming of the other two parts, I decided to make a Telegram bot, where I could type queries and it would reply with the current Meteor Indicator and prices for entry and exit, on different intervals. For a simple bot, I use yagop’s node-telegram-bot-api module. It’s very easy to get started, for example with the keyword “/echo”:

bot.onText(/\/echo (.+)/, (msg, match) => {
  // 'msg' is the received Message from Telegram
  // 'match' is the result of executing the regexp above on the text content
  // of the message

  const chatId = msg.chat.id; // User ID
  const resp = match[1]; // the captured "whatever"

  // send back the matched "whatever" to the chat
  bot.sendMessage(chatId, resp);
});

With this, I can use a keyword with multiple options to craft a very specific query. For example, using “/check buy 30m”, the bot will reply with every pair whose Meteor Indicator suggests a long position on the 30m interval. Pretty neat!

dockerfile(s)

First off, let’s start with a fresh and current Node image :

FROM node:10.12

RUN apt-get update

ADD / /opt/potency
WORKDIR /opt/potency

RUN npm install
RUN npm update

It is a very simple dockerfile: it copies the project files into the image, then installs all the npm modules. The dockerfile is the same for warmup, feed and bot. I know I could combine everything under one container, but I’m not used to that and was a bit rushed, so it was faster to use 3 containers.

docker-compose

Remember when I used MongoDB? Yeah, we need a mongo container too! This simple docker-compose file will build and start every container in this order : mongo, warmup, feed and bot.

version: "3"
services:
  warmup:
    build: ./warmup/
    command: node index.js
    depends_on:
      - mongo
  feed:
    build: ./feed/
    command: node index.js
    depends_on:
      - warmup
  bot:
    build: ./bot/
    command: node index.js
    depends_on:
      - feed
  mongo:
    image: mongo
    container_name: mongo
    ports:
      - 27017:27017

In only one command, I can start the whole project from any computer/server I have, and scale it easily. Let’s see how successful my trades are with this!
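
for the record, that one command is the usual:

docker-compose up -d --build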