In the middle of the desert you can say anything you want
docker run -it --name inception -p8080:8080 ghcr.io/inception-project/inception:35.1
$ docker run -it --name inception -v /srv/inception:/export -p8080:8080 ghcr.io/inception-project/inception:35.1
/srv/inception
Creating a project automatically fills it with sample data:
Tagsets
{
  "name" : "BBK",
  "description" : null,
  "language" : null,
  "tags" : [ {
    "tag_name" : "aaa_human_processed",
    "tag_description" : null
  }, {
    "tag_name" : "block",
    "tag_description" : null
  } ],
  "create_tag" : false
}
A layer has to be linked to a (string) feature, which can then be linked to a tagset (INCEpTION User Guide).
Annotations get saved automatically.
In the viewer, you can enable dynamic coloring, so that different annotations get different colors.
Using uv as your shebang line – Rob Allen (HN comments), and a more detailed article on this: Lazy self-installing Python scripts with uv.
But especially Defining Python dependencies at the top of the file – Rob Allen and PEP 723 – Inline script metadata | peps.python.org
You can add uv to the shebang line as
#!/usr/bin/env -S uv run --script
And you can set requirements by adding this under the shebang line:
# /// script
# requires-python = ">=3.11"
# dependencies = [
#   "flickrapi",
# ]
# ///
Then you can uv run sync-flickr-dates.py
Full package:
#!/usr/bin/env -S uv run --script
# /// script
# requires-python = ">=3.11"
# dependencies = [
#   "flickrapi",
# ]
# ///
import flickrapi
print("\nI am running!")
❯ chmod +x test.py
❯ ./test.py
Installed 11 packages in 134ms
I am running!
Neat!
git - How to cherry-pick multiple commits - Stack Overflow:
For one commit you just paste its hash.
For multiple you list them, in any order.
For a range, you write oldest..latest, but add ~, ^ or ~1 to the oldest to include it. Quoting directly from the SO answer:
# A. INCLUDING the beginning_commit
git cherry-pick beginning_commit~..ending_commit
# OR (same as above)
git cherry-pick beginning_commit~1..ending_commit
# OR (same as above)
git cherry-pick beginning_commit^..ending_commit
# B. NOT including the beginning_commit
git cherry-pick beginning_commit..ending_commit
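The range syntax can be sanity-checked in a throwaway repo: commits B, C, D live on a feature branch, and cherry-picking `B^..D` onto the base branch brings all three over (all names and files below are made up for the demo):

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"
git init -q
git config user.email demo@example.com
git config user.name demo
base=$(git symbolic-ref --short HEAD)   # default branch name varies by git version

# Commit A on the base branch, then B, C, D on a feature branch
echo a > fa; git add fa; git commit -qm A
git checkout -q -b feature
echo b > fb; git add fb; git commit -qm B
B=$(git rev-parse HEAD)
echo c > fc; git add fc; git commit -qm C
echo d > fd; git add fd; git commit -qm D
D=$(git rev-parse HEAD)

# Back on base: cherry-pick the range INCLUDING commit B
git checkout -q "$base"
git cherry-pick "$B^..$D"
ls   # fa fb fc fd — all three commits were applied
```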
So, given that kubectl cp
was never reliable for me, leading to many notes here, incl. 250115-1052 Rancher much better way to copy data to PVCs, with various hacks and issues like 250117-1127 Splitting files, 250117-1104 Unzip in alpine is broken, etc.
For many/large files, I’d have used rsync
, for which ssh access is theoretically needed. Not quite!
rsync files to a kubernetes pod - Server Fault
ksync.sh
(EDIT Updated by ChatGPT to support files with spaces):
#!/bin/bash
# First invocation: re-exec rsync, using this very script as the remote shell.
if [ -z "$KRSYNC_STARTED" ]; then
    export KRSYNC_STARTED=true
    exec rsync --blocking-io --rsh "$0" "$@"
fi

# Running as --rsh
namespace=''
pod=$1
shift

# If user uses pod@namespace, rsync passes args as: {us} -l pod namespace ...
if [ "X$pod" = "X-l" ]; then
    pod=$1
    shift
    namespace="-n $1"
    shift
fi

# Execute kubectl with proper quoting
exec kubectl $namespace exec -i "$pod" -- "$@"
Usage is basically the same as rsync:
./ksync.sh -av --info=progress2 --stats /local/dir/to/copy/ PODNAME@NAMESPACE:/target/dir/
(Or just --progress for per-file instead of total progress.)
rsync needs to be installed inside the pod for this to work.
For flaky connections (TODO document better): -hvvrPt --timeout=SECONDS
and while ! rsync ...; do sleep 5; done
TL;DR pipx inject target_app package_to_inject
pipx install psutil
refuses: it's a library, not an app. I needed psutil
for the MemoryGraph widget in (pipx install
-ed) qtile, where that doesn't help. pipx inject qtile psutil does:
❯ pipx inject qtile psutil
injected package psutil into venv qtile
done! ✨ 🌟 ✨
If no real config thingy is required/wanted, then this works (stolen from Parsing Dictionary-Like Key-Value Pairs Using Argparse in Python | Sumit’s Space)1:
import argparse


def parse_args():
    class ParseKwargs(argparse.Action):
        def __call__(self, parser, namespace, values, option_string=None):
            setattr(namespace, self.dest, dict())
            for value in values:
                key, value = value.split("=")
                getattr(namespace, self.dest)[key] = value

    parser = argparse.ArgumentParser()
    parser.add_argument("--no-pics", action="store_true", help="Predict only on videos")
    # ...
    parser.add_argument(
        "-k",
        "--kwargs",
        nargs="*",
        action=ParseKwargs,
        help="Additional inference params, e.g.: -k batch=128 conf=0.2",
    )
    return parser.parse_args()
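To see the action in action, a minimal version of the parser can be saved to a file and called with key=value pairs (the script name and parameter names here are made up):

```shell
# Write a minimal script using the ParseKwargs action, then run it
cat > predict.py <<'EOF'
import argparse

class ParseKwargs(argparse.Action):
    def __call__(self, parser, namespace, values, option_string=None):
        setattr(namespace, self.dest, dict())
        for value in values:
            key, value = value.split("=")
            getattr(namespace, self.dest)[key] = value

parser = argparse.ArgumentParser()
parser.add_argument("-k", "--kwargs", nargs="*", action=ParseKwargs)
args = parser.parse_args()
print(args.kwargs)
EOF

python3 predict.py -k batch=128 conf=0.2
# prints {'batch': '128', 'conf': '0.2'}
```

Note that the values arrive as strings; any type conversion is up to the caller.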
interesting mix of topics on that website ↩︎
#!/bin/bash
BATTINFO=$(acpi -b)
LIM="00:15:00"
if [[ $(echo "$BATTINFO" | grep Discharging) && $(echo "$BATTINFO" | cut -f 5 -d " ") < $LIM ]] ; then
# DISPLAY=:0.0 /usr/bin/notify-send "low battery" "$BATTINFO"
dunstify "low battery" "$BATTINFO"
fi
For this, install dunst and run it on startup, then add a cron job for the script above.
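The cron entry could look like this (the script path and the 5-minute interval are assumptions; also, cron jobs have no D-Bus session by default, so the script may need DBUS_SESSION_BUS_ADDRESS exported for dunstify to reach dunst):

```
*/5 * * * * /home/me/bin/low-battery-check.sh
```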
E.g. to upload something to a place where uploading large files is hard.
See also: 250117-1104 Unzip in alpine is broken
# split
split -b 2G myfile.zip part_
# back
cat part_* > myfile.zip
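A quick round-trip sanity check of the two commands above (file names and the 1 MB chunk size are made up for the demo); `cat part_*` works because split names chunks `part_aa`, `part_ab`, … which sort in the right order:

```shell
# Make a 3 MB test file, split it into 1 MB chunks, reassemble, compare
dd if=/dev/urandom of=myfile.bin bs=1M count=3 2>/dev/null
split -b 1M myfile.bin part_
cat part_* > reassembled.bin
cmp myfile.bin reassembled.bin && echo identical
```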
TL;DR alpine’s unzip
is busybox’s, and fails for me with
/data/inference_data # unzip rd1.zip
Archive: rd1.zip
unzip: short read
apk add unzip
installs the real one I have on all other computers, and then it works.
spec:
  affinity:
    nodeAffinity:
      requiredDuringSchedulingIgnoredDuringExecution:
        nodeSelectorTerms:
          - matchExpressions:
              - key: kubernetes.io/hostname
                operator: NotIn
                values:
                  - node_to_avoid
(operator: In for a list of allowed nodes)
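For completeness, the allow-list variant might look like this (node names are hypothetical):

```yaml
spec:
  affinity:
    nodeAffinity:
      requiredDuringSchedulingIgnoredDuringExecution:
        nodeSelectorTerms:
          - matchExpressions:
              - key: kubernetes.io/hostname
                operator: In
                values:
                  - allowed-node-1
                  - allowed-node-2
```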