Compare commits

...

223 Commits
dev ... master

Author SHA1 Message Date
7ad8e4a781 [+] fix ruff execution in cli 2025-04-14 17:32:25 +03:00
8d43813c37 [+] add linters 2025-04-12 16:22:21 +03:00
ae4fcb16f2 [+] fix ninja tests 2025-04-08 16:42:47 +03:00
7fa2a8a83c [+] update platform
1. update udev for air 2012;
  2. update commands for air 2012;
2025-04-04 12:40:09 +03:00
fa732867c3 [+] tune power management for air 2018 2025-04-01 12:40:53 +03:00
61584fedac [+] fix intel_pstate cpufreq 2025-03-31 17:35:08 +03:00
42ce6ffbff [+] update pr34
1. update m.py;
  2. add intel_pstate profile for cpufreq;
  3. build 0.1.5.0;
2025-03-31 17:15:53 +03:00
385e82bab8 [+] add dotfiles for platform macbook air 2018 2025-03-31 17:04:57 +03:00
ccb0fb09c9 [+] add platform dotfiles 2025-03-31 17:04:56 +03:00
6d184edec8 [+] update systemd 2025-03-25 14:52:31 +03:00
6374849014 [+] migrate to systemd 2025-03-20 19:20:55 +03:00
3d023047b4 [+] update cpanel
1. fix logging;
  2. refactor into a class;
2025-03-20 19:07:34 +03:00
43c711cb2c [+] add make
1. add venv_compile;
  2. add venv;
  3. add mypy;
  4. update d1/cpanel.py
    to aggregate .ssh/known_hosts files;
  5. add systemd service for a gateway;
2025-03-20 19:03:05 +03:00
dff3782834 [+] integrate with podman 2025-03-20 17:02:18 +03:00
b399c5546c [+] update pr34
1. update katerc;
  2. add mime.types to view some source code
    as plain text in firefox;
  3. update .whl release for pr34;
  4. update commands_typed/typing.py;
2025-03-18 10:02:27 +03:00
c1e598b3ab [+] update nginx multiplexing 2025-03-15 13:40:38 +03:00
7442368b03 [+] update gateway
1. add systemd units deployment recipe;
  2. add certbot periodic task;
  3. update nginx_config.py
    to use ssl_preread_server_name
    instead of protocol, since it seems
    to be broken;
2025-03-15 12:06:52 +03:00
4cf720ee17 [+] add .whl 2025-03-13 19:35:11 +03:00
93f023dda3 [+] fix percentage 2025-03-13 19:32:17 +03:00
e06a1f4007 [+] add env for meson_setup 2025-03-13 12:04:54 +03:00
9ac87fb3df [+] update cli_bootstrap
1. generate requirements with hashes;
2025-03-12 19:16:28 +03:00
c8b6d96b01 [+] add .lock generation, fix mypy 2025-03-12 11:02:20 +03:00
cc0acd6f13 [+] add .whl 2025-03-11 14:19:42 +03:00
acf1850d70 [+] add env for ninja 2025-03-11 14:14:13 +03:00
9e117048dc [+] add parse_args 2025-03-08 18:39:35 +03:00
723c5b6677 [+] pack .whl release 2025-03-04 18:49:12 +03:00
aaf8b12549 [+] check invalid password 2025-03-04 18:39:48 +03:00
06e79d0679 [+] fix secret_check 2025-03-04 18:38:52 +03:00
ff786e3ce6 [+] update pr34
1. add dotfiles deploy via .tar.xz;
  2. update .sway/config;
  3. update test_crypto.py;
2025-03-04 15:44:20 +03:00
add9d858d8 [+] improve crypto 2025-03-03 18:00:51 +03:00
731c507b95 [+] add tests for pr34 2025-03-03 17:57:14 +03:00
62063a1448 [+] update pr34
1. partially improve crypto;
  2. fix m.py;
2025-03-03 12:53:58 +03:00
1fb4e4efc5 [+] update pr34
1. partially add crypto logic;
  2. fetch updated commands from laptop;
  3. update Makefile;
2025-03-03 09:51:26 +03:00
b12395621d [+] update desktop-services 2025-02-22 18:54:10 +03:00
905241a068 [+] update cli 2025-02-18 18:29:48 +03:00
01aab0517a [+] improve pip_resolve
1. add -r flag
    that can parse complicated
    requirements
    and format into a temp file
    for uv pip compile;
2025-01-24 21:47:43 +03:00
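A minimal sketch of the -r flow described in the commit above, with an assumed helper name and simplified parsing rules (not the actual pr34 implementation): comments and nested includes are stripped, the bare specifiers are formatted into a temporary .in file, and that path is what gets handed to uv pip compile.

import tempfile

def resolve_r_file(requirements_path: str) -> str:
    # Keep only bare requirement specifiers; comments and nested -r/-c
    # includes are dropped in this simplified sketch.
    specs = []
    with open(requirements_path) as f:
        for raw in f:
            line = raw.split('#', 1)[0].strip()
            if not line or line.startswith(('-r', '-c')):
                continue
            specs.append(line)
    # Format the flattened specifiers into a temp file that can then be
    # fed to `uv pip compile --generate-hashes`.
    with tempfile.NamedTemporaryFile('w', suffix='.in', delete=False) as tmp:
        tmp.write('\n'.join(specs) + '\n')
    return tmp.name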
6ddfc7d2d7 [+] update dotfiles
1. add kate editor config;
2025-01-24 21:25:13 +03:00
3245d6d7e5 [+] add uv_pip_freeze
1. uv_pip_compile
    is just a raw wrapper
    around uv pip compile --generate-hashes;
  2. uv_pip_freeze
    compiles frozen dependencies
    with generated hashes along the way;
2025-01-24 21:15:28 +03:00
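A sketch of how the two helpers relate, assuming uv is on PATH; the bodies are illustrative, not the actual pr34 API: uv_pip_compile is a thin wrapper over uv pip compile --generate-hashes, and uv_pip_freeze feeds a uv pip freeze snapshot through that wrapper so the frozen pins come out with hashes.

import subprocess
import tempfile

def uv_pip_compile(requirements_in: str, output: str) -> None:
    # Raw wrapper around `uv pip compile --generate-hashes`.
    with open(output, 'w') as out:
        subprocess.check_call(
            ['uv', 'pip', 'compile', '--generate-hashes', requirements_in],
            stdout=out,
        )

def uv_pip_freeze(output: str) -> None:
    # Freeze the current environment, then compile the frozen list so the
    # resulting pins carry --hash entries along the way.
    frozen = subprocess.check_output(['uv', 'pip', 'freeze'], text=True)
    with tempfile.NamedTemporaryFile('w', suffix='.in', delete=False) as tmp:
        tmp.write(frozen)
    uv_pip_compile(tmp.name, output)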
a529db106a [+] fix mypy errors 2025-01-24 21:06:26 +03:00
57f74df865 [+] fix some mypy errors 2025-01-24 18:27:51 +03:00
528d9b1ce5 [+] integrate uv pip compile 2025-01-23 12:17:07 +03:00
ebbd1a2b5b [+] raise to 0.1.4.9 2025-01-18 21:22:56 +03:00
62cfbf36cb [+] fix --hash format 2025-01-18 21:13:26 +03:00
d643e8f97b [+] improve pip_resolve
1. do not download anything;
  2. provide hashes per dependency;
  3. test that it works on numpy;
2025-01-18 20:56:03 +03:00
9b5fff93c0 [+] add pip_resolve
1. add freezing packages based on the pip resolver;
  1.1. the command
    should resolve final packages with versions,
    and provide constraints with file hashes,
    to act like the go mod tool;
2025-01-18 19:34:46 +03:00
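The constraints described above follow pip's standard hash-pinning layout (pkg==version plus one --hash=sha256:... per file). A small sketch of the formatting step, with an assumed input shape; the real pip_resolve internals are not shown in this diff.

def format_constraints(resolved: dict[str, tuple[str, list[str]]]) -> str:
    # `resolved` maps a project name to (pinned version, sha256 hashes of
    # its files); emit `pkg==version --hash=sha256:...` constraint entries.
    lines = []
    for name, (version, hashes) in sorted(resolved.items()):
        entry = ['%s==%s' % (name, version)]
        entry += ['    --hash=sha256:%s' % h for h in hashes]
        lines.append(' \\\n'.join(entry))
    return '\n'.join(lines) + '\n'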
ff22d9311b [+] update
1. fix python3.13 regression
    with class property;
  2. update books dependency;
2025-01-18 12:43:28 +03:00
759ce361e3 [+] fix .gitmodules 2025-01-12 18:23:47 +03:00
136b5709b0 [+] refactor book1
1. move book1 into a private repo;
2025-01-12 18:20:19 +03:00
584b4b652f [+] add submodule books 2025-01-12 11:02:29 +03:00
7d1d887692 [+] update keybindings 2025-01-02 10:42:58 +03:00
212c8c8086 [+] improve deploy:wheel command 2025-01-01 02:19:40 +03:00
6599115a68 [+] 0.1.4.7, improve os module
1. add runtime initialization;
    works for .so files;
    linux platform only at the moment;
2024-12-30 15:03:45 +03:00
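A hedged sketch of what the ".so runtime initialization" could boil down to, using ctypes and a Linux-only guard; the helper name and lookup layout are assumptions, not the actual online.fxreader.pr34 os module.

import ctypes
import pathlib
import sys

def load_native(name: str, search_dir: pathlib.Path) -> ctypes.CDLL:
    # Runtime initialization of a bundled shared object; Linux only,
    # matching the "linux platform only at the moment" note above.
    if not sys.platform.startswith('linux'):
        raise RuntimeError('.so loading is only handled on Linux here')
    return ctypes.CDLL(str(search_dir / ('%s.so' % name)))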
22f5f0fba3 [+] raise 0.1.4.1 2024-12-29 14:44:07 +03:00
c9382162de [+] update cli logic
1. add pip_sync method;
  1.1. rely on the -f flag
    of pip to use a custom cache dir;
2024-12-29 14:36:30 +03:00
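pip's -f/--find-links flag points pip at a local directory of wheels and sdists, so a pre-populated cache dir can serve the install. A rough sketch of how pip_sync could drive it; the function name and the extra --no-index flag are illustrative, not necessarily the actual command line.

import subprocess
import sys

def pip_sync(requirements_txt: str, cache_dir: str) -> None:
    # Install strictly from the local cache dir via -f/--find-links;
    # --no-index keeps pip from falling back to PyPI in this sketch.
    subprocess.check_call([
        sys.executable, '-m', 'pip', 'install',
        '--no-index', '-f', cache_dir,
        '-r', requirements_txt,
    ])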
0cee9beaea [+] improve mypy step while deploy:wheel 2024-12-22 21:07:05 +03:00
93437359a8 [+] fix mypy errors 2024-12-22 21:02:35 +03:00
9c8b554acc [+] update pr34
1. update dependencies handling
    during venv creation;
  2. update cli and cli_bootstrap modules;
2024-12-22 20:45:46 +03:00
adc3fd0205 [+] fix mypy errors 2024-12-22 19:18:06 +03:00
48ef23aa88 [+] update vscode dot files 2024-12-22 18:38:29 +03:00
1626974759 [+] update packaging
1. update cli and cli_bootstrap;
  1.1. use /meson and /pyproject
    build dirs to fix collision
    for install step;
2024-12-22 18:33:06 +03:00
ea63c67280 [+] update cli module; 2024-12-22 12:06:07 +03:00
34c65f7ba5 [+] update commands
1. add basic cli.py module;
  2. add stubs for debugpy;
2024-12-22 10:31:31 +03:00
dedff2aee5 [+] update repo 2024-12-12 17:03:33 +03:00
cb69309307 [+] update repo
1. add asyncio.handle_task_result;
  2. improve pass_ssh_osx;
  3. update vpn dependency;
2024-12-10 20:28:00 +03:00
0a50c26d1d [+] add qrcode mode for pass ssh osx 2024-12-05 09:03:22 +03:00
a6cdf03523 [+] fix exit status for commands
1. fix intercept_output when the process has died
    before the select poll has been started;
2024-12-04 20:28:20 +03:00
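The race described above: the child can exit and close its pipe before the read loop ever runs, so the reader never sees the descriptor become ready and the exit status gets lost. A minimal sketch of the guard, assuming a Popen child with a piped stdout; this is not the actual intercept_output implementation.

import select
import subprocess
import sys

def intercept_output(p: subprocess.Popen[bytes]) -> int:
    # Drain stdout until EOF; checking p.poll() inside the loop covers the
    # case where the process died before the first select() call.
    assert p.stdout is not None
    while True:
        ready, _, _ = select.select([p.stdout], [], [], 0.5)
        if ready:
            chunk = p.stdout.read1(4096)
            if not chunk:
                break              # EOF: the child closed its end
            sys.stdout.buffer.write(chunk)
        elif p.poll() is not None:
            break                  # child already exited, nothing buffered
    return p.wait()                # propagate the real exit status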
eb457950d3 [+] add launch.json 2024-12-04 19:04:06 +03:00
4afe4048d9 [+] update debugging
1. disable telemetry in ms-python;
  1.1. TODO, use a forked version of the plugin;
  2. add debug module into pr34;
  3. enable show stack frame for all .py files;
2024-12-04 19:01:30 +03:00
74cc54ae85 [~] update vpn 2024-12-03 19:07:43 +03:00
fd0dbb0c4a [+] update vpn dependency 2024-12-03 09:25:42 +03:00
d0b696206c [r] update versions of pip packages 2024-12-03 09:21:45 +03:00
92966ca86d [~] Refactor 2024-12-03 09:17:16 +03:00
d38022b5a6 [+] update packaging 2024-12-01 18:49:12 +03:00
796fb51ad9 [~] Refactor 2024-12-01 09:04:58 +03:00
b32b058083 [~] Refactor 2024-12-01 08:49:34 +03:00
d3d5e3bcfb [~] Refactor 2024-11-30 18:59:22 +03:00
64706907ca [~] Refactor 2024-11-29 21:17:34 +03:00
8656b3b985 [~] Refactor 2024-11-28 12:24:38 +03:00
1cfc6c52f3 [~] Refactor 2024-11-25 16:56:28 +03:00
62f4d9f948 [~] Refactor 2024-11-25 12:41:06 +03:00
5c69e922a7 [~] Refactor 2024-11-24 22:47:42 +03:00
27af208437 [~] Refactor 2024-11-24 22:35:54 +03:00
fadd95cef3 [~] Refactor 2024-11-24 19:54:24 +03:00
bd9878335a [~] add host_deps command 2024-11-24 19:47:36 +03:00
cadec5376a [~] Refactor 2024-11-24 19:34:24 +03:00
0ef9268f48 [~] Refactor 2024-11-24 02:28:19 +03:00
60929176ef [~] Refactor 2024-11-24 00:26:23 +03:00
0d9e225a76 [~] Refactor 2024-11-23 23:56:23 +03:00
3321e4c165 [~] Refactor 2024-11-23 23:51:06 +03:00
dffcb96358 [~] Refactor 2024-11-22 23:13:18 +03:00
fc18a91f63 [~] Refactor 2024-11-22 23:11:57 +03:00
34c3ed74eb [~] Refactor 2024-11-22 23:08:18 +03:00
2522ed4ac4 [~] Refactor 2024-11-22 22:51:31 +03:00
6d1c845d74 [~] Refactor 2024-11-22 21:24:07 +03:00
83b1177f85 [~] Refactor 2024-11-21 10:17:37 +03:00
9620bb2b25 [~] Refactor 2024-11-15 20:57:06 +03:00
f9f18df737 [~] Refactor 2024-11-15 20:47:41 +03:00
14b499637b [~] Refactor 2024-11-15 20:47:41 +03:00
40350d128a [~] Refactor 2024-11-15 20:47:40 +03:00
1eecc616f5 [~] Refactor 2024-11-15 20:47:40 +03:00
c4944ede7f [~] Refactor 2024-11-15 20:47:40 +03:00
43985cd5b0 [~] Refactor 2024-11-15 20:47:40 +03:00
ed347e1d21 [~] Refactor 2024-11-15 20:47:40 +03:00
32579007e4 [~] Refactor 2024-11-15 20:47:40 +03:00
82a7c62ca8 [~] Refactor 2024-11-15 20:47:40 +03:00
de05d2cd05 [~] Refactor 2024-11-15 20:47:40 +03:00
723ad56ca9 [~] Refactor 2024-11-15 20:47:40 +03:00
eb310ceef7 [~] Refactor 2024-11-15 20:47:40 +03:00
81bf066e82 [~] Refactor 2024-11-15 20:47:40 +03:00
560e748867 [~] Refactor 2024-11-02 12:26:03 +03:00
48ce6c87a6 [~] Refactor 2024-09-30 23:29:37 +03:00
58a1e5f1be [~] Refactor 2024-09-30 22:27:35 +03:00
62b809bd3c [~] Refactor 2024-09-30 22:26:50 +03:00
6b7bef9c07 [~] Refactor 2024-09-22 17:43:24 +03:00
16a075b19c [~] Refactor 2024-09-18 00:00:51 +03:00
92fe90b042 [~] Refactor 2024-09-08 21:52:10 +03:00
a7f5be4c8e [~] Refactor 2024-08-24 12:42:13 +03:00
392faf5526 [~] Refactor 2024-08-24 12:40:15 +03:00
fd6378a643 [~] Refactor 2024-08-24 12:26:16 +03:00
32ee4316f1 [~] Refactor 2024-08-17 11:25:08 +03:00
1140a46799 [~] Refactor 2024-08-17 11:21:05 +03:00
1ad4fab099 [~] Refactor 2024-08-17 11:18:09 +03:00
edea7a4fab [~] Refactor 2024-08-17 11:13:14 +03:00
e5cb3bbb53 [~] Refactor 2024-08-17 11:04:36 +03:00
f4f214d75b [~] Refactor 2024-08-17 11:01:39 +03:00
27be8c3104 [~] Refactor 2024-08-17 10:46:51 +03:00
c95503fa76 [~] Refactor 2024-08-17 10:46:18 +03:00
ce77de626c [~] Refactor 2024-08-17 10:44:30 +03:00
f2b8862683 [~] Refactor 2024-08-17 10:22:22 +03:00
8dbedd9c28 [~] Refactor 2024-08-17 10:04:32 +03:00
a9393dbff2 [~] Refactor 2024-08-17 10:02:25 +03:00
4ed4dbcbdb [~] Refactor 2024-08-17 09:47:31 +03:00
d0177e7255 [~] Refactor 2024-08-17 09:43:40 +03:00
10be7dc897 [~] Refactor 2024-08-17 09:26:59 +03:00
d7d4260053 [~] Refactor 2024-08-17 09:23:03 +03:00
d4ef7d5ba4 [~] Refactor 2024-08-17 09:16:35 +03:00
b5343c1375 [~] Refactor 2024-08-17 09:10:16 +03:00
a9e8d8a505 [~] Refactor 2024-08-17 09:07:49 +03:00
de292518a5 [~] Refactor 2024-08-17 09:05:15 +03:00
3e9708d442 [~] Refactor 2024-08-17 08:53:54 +03:00
9ad2030f36 [~] Refactor 2024-08-17 01:15:07 +03:00
aa8d84614c [~] Refactor 2024-08-17 01:08:58 +03:00
13f0899b90 [~] Refactor 2024-08-17 00:50:24 +03:00
2385012e35 [~] Refactor 2024-08-17 00:38:20 +03:00
2b3305ab56 [~] Refactor 2024-08-17 00:37:29 +03:00
a67d765569 [~] Refactor 2024-08-17 00:29:20 +03:00
3163641646 [~] Refactor 2024-08-16 17:43:29 +03:00
d06a7655d7 [~] Refactor 2024-08-16 17:24:14 +03:00
ac7515f49d [~] Refactor 2024-08-14 23:49:04 +03:00
d3718cf271 [~] Refactor 2024-08-14 23:45:17 +03:00
45d0be60f9 [~] Refactor 2024-08-14 23:35:01 +03:00
03164b8e34 [~] Refactor 2024-08-14 23:32:53 +03:00
bd89fff408 [~] Refactor 2024-08-14 23:29:10 +03:00
7444095e03 [~] Refactor 2024-08-14 23:17:43 +03:00
1018ad3266 [~] Refactor 2024-08-12 23:19:21 +03:00
d6f5817c78 [~] Refactor 2024-08-12 23:16:10 +03:00
1e55e7baac [~] Refactor 2024-08-12 23:13:56 +03:00
b9297f6863 [~] Refactor 2024-08-12 23:02:41 +03:00
ae897835b2 [~] Refactor 2024-08-12 23:01:51 +03:00
633e1c6424 [~] Refactor 2024-08-12 22:59:56 +03:00
d782fbfbf2 [~] Refactor 2024-08-12 21:35:42 +03:00
aead4c165d [~] Refactor 2024-08-12 21:33:40 +03:00
d50f17eb76 [~] Refactor 2024-08-12 21:31:08 +03:00
f656d94fa3 [~] Refactor 2024-08-12 03:02:30 +03:00
b6192902ff [~] Refactor 2024-08-12 02:56:40 +03:00
d5c616806a [~] Refactor 2024-08-12 02:54:11 +03:00
184340db4f [~] Refactor 2024-08-12 02:45:33 +03:00
58071843be [~] Refactor 2024-08-12 02:44:10 +03:00
b93ee0b7e4 [~] Refactor 2024-08-12 02:42:12 +03:00
71c793cbae [~] Refactor 2024-08-12 02:29:33 +03:00
5c271c518f [~] Refactor 2024-08-12 02:28:10 +03:00
4909d474b1 [~] Refactor 2024-08-12 02:27:11 +03:00
430b517ce7 [~] Refactor 2024-08-12 02:24:25 +03:00
11a3c59961 [~] Refactor 2024-08-12 02:22:28 +03:00
c16f389324 [~] Refactor 2024-08-12 02:20:46 +03:00
4758032a35 [~] Refactor 2024-08-12 02:18:47 +03:00
c1b7bb71b3 [~] Refactor 2024-08-12 02:11:26 +03:00
8ec95247f9 [~] Refactor 2024-08-12 02:08:54 +03:00
b3156b5093 [~] Refactor 2024-08-12 01:09:24 +03:00
f93b604c51 [~] Refactor 2024-08-12 00:39:06 +03:00
437e073411 [~] Refactor 2024-08-12 00:35:31 +03:00
99bc0d5ab1 [~] Refactor 2024-08-12 00:25:08 +03:00
42bfe00ee5 [~] Refactor 2024-08-12 00:23:40 +03:00
45626d85d4 [~] Refactor 2024-08-12 00:12:58 +03:00
cf849b1009 [~] Refactor 2024-08-12 00:09:51 +03:00
c867c7d828 [~] Refactor 2024-08-12 00:07:23 +03:00
0d7c0d9d09 [~] Refactor 2024-08-12 00:03:35 +03:00
9725a1493c [~] Refactor 2024-08-11 23:59:45 +03:00
15611baed2 [~] Refactor 2024-08-11 23:53:12 +03:00
010b426b03 [~] Refactor 2024-08-11 23:50:50 +03:00
826aec8f3f [~] Refactor 2024-08-11 23:37:47 +03:00
4b93c33d66 [~] Refactor 2024-08-11 23:31:41 +03:00
495e901a23 [~] Refactor 2024-08-11 23:25:02 +03:00
29053ceb5d [~] Refactor 2024-08-11 23:23:57 +03:00
1411dc60e5 [~] Refactor 2024-08-11 23:22:43 +03:00
12c55aa0e8 [~] Refactor 2024-08-11 23:20:45 +03:00
508c55bd68 [~] Refactor 2024-08-11 00:38:49 +03:00
c57b449f69 [~] Refactor 2024-08-11 00:34:39 +03:00
45945c0894 [~] Refactor 2024-08-11 00:30:25 +03:00
fb8b741ca0 [~] Refactor 2024-08-11 00:29:14 +03:00
e6038b7060 [~] Refactor 2024-08-11 00:25:15 +03:00
e1aa8edd42 [~] Refactor 2024-08-11 00:14:10 +03:00
39fa0f23ea [~] Refactor 2024-08-11 00:10:20 +03:00
852c15635f [~] Refactor 2024-08-11 00:00:08 +03:00
b1339cad6b [~] Refactor 2024-08-10 23:55:35 +03:00
01a09989ef [~] Refactor 2024-08-10 23:52:53 +03:00
f2926b6f2b [~] Refactor 2024-08-10 22:07:39 +03:00
e60308a28d [~] Refactor 2024-08-10 21:54:29 +03:00
b8b2443c2a [~] Refactor 2024-08-10 21:52:41 +03:00
3f49dd337f [~] Refactor 2024-08-10 21:51:10 +03:00
331e11d728 [~] Refactor 2024-08-10 21:49:26 +03:00
3b2dd8045e [~] Refactor 2024-08-10 21:37:33 +03:00
bbcf3fb16d [~] Refactor 2024-08-10 21:33:29 +03:00
25c9f86ef0 [~] Refactor 2024-08-10 21:32:28 +03:00
16cfa6c481 [~] Refactor 2024-08-10 21:30:58 +03:00
04f8dca7e6 [~] Refactor 2024-08-10 21:19:59 +03:00
a7993b8a82 [~] Refactor 2024-08-10 21:18:59 +03:00
c00d06368f [~] Refactor 2024-08-10 21:18:34 +03:00
b4836b6514 [~] Refactor 2024-08-10 21:16:52 +03:00
11cd349a7e [~] Refactor 2024-08-10 21:11:45 +03:00
34ae744866 [~] Refactor 2024-08-10 21:09:55 +03:00
99c1c3c80b [~] Refactor 2024-07-01 09:53:06 +03:00
9fe12b0be1 [~] Refactor 2024-07-01 09:36:11 +03:00
118 changed files with 21730 additions and 5789 deletions

2
.gitattributes vendored Normal file

@ -0,0 +1,2 @@
releases/tar/** filter=lfs diff=lfs merge=lfs -text
releases/whl/** filter=lfs diff=lfs merge=lfs -text

13
.gitignore vendored

@ -2,3 +2,16 @@ tmp
__pycache__
d2/book1/books
.DS_Store
.vim
*.so
.mypy_cache
.ruff_cache
.tmuxp
*.egg-info
*.whl
*.tar.gz
.vscode/*
!.vscode/launch.json
python/build
.*.kate-swp
!releases/whl/*.whl

6
.gitmodules vendored

@ -16,3 +16,9 @@
[submodule "deps/melianmiko-mb7-apps"]
path = deps/melianmiko-mb7-apps
url = https://notabug.org/melianmiko/mb7_apps
[submodule "deps/com.github.aiortc.aiortc"]
path = deps/com.github.aiortc.aiortc
url = https://gitea.fxreader.online/nartes/com.github.aiortc.aiortc
[submodule "deps/online.fxreader.nartes.books"]
path = deps/online.fxreader.nartes.books
url = https://gitea.fxreader.online/nartes/books.git

18
.mypy.ini Normal file

@ -0,0 +1,18 @@
[mypy]
mypy_path =
mypy-stubs,
deps/com.github.aiortc.aiortc/src,
mypy-stubs/marisa-trie-types,
mypy-stubs/types-debugpy,
python
exclude =
python/tmp,
python/build
plugins =
numpy.typing.mypy_plugin,
pydantic.mypy
explicit_package_bases = true
namespace_packages = true

49
.vscode/launch.json vendored Normal file

@ -0,0 +1,49 @@
{
// Use IntelliSense to learn about possible attributes.
// Hover to view descriptions of existing attributes.
// For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
"version": "0.2.0",
"configurations": [
/*
{
"name": "Python Debugger: Module",
"type": "debugpy",
"request": "launch",
"module": "online_fxreader.vpn.vpn",
},
{
"name": "Python Debugger: Current File with Arguments",
"type": "debugpy",
"request": "launch",
"program": "${file}",
"console": "integratedTerminal",
"args": [
"${command:pickArgs}"
]
},
*/
{
"name": "Python Debugger: Remote Attach",
"type": "debugpy",
"request": "attach",
"connect": {
"host": "127.0.0.1",
"port": 4444
},
"pathMappings": [
/*
{
"localRoot": "${workspaceFolder}/deps/com.github.aiortc.aiortc/src/",
//"remoteRoot": "."
"remoteRoot": "~/.local/bin/env3/lib/python3.12/site-packages/",
},
{
"localRoot": "${workspaceFolder}/deps/com.github.aiortc.aiortc/",
//"remoteRoot": "."
"remoteRoot": "~/.local/bin/env3/lib/python3.12/site-packages/",
}
*/
]
}
]
}

140
Makefile Normal file

@ -0,0 +1,140 @@
.PHONY: python_clean_online_fxreader_vpn
host_deps:
./m.py host_deps
python_lint:
./m.py mypy -- -f vscode 2>&1 | less
python_tests:
./m.py tests
#python_clean_online_fxreader_vpn:
# rm -fr \
# deps/com.github.aiortc.aiortc/src/online_fxreader/vpn/dist;
PYTHON_PROJECTS ?= \
deps/com.github.aiortc.aiortc/ \
deps/com.github.aiortc.aiortc/src/online_fxreader/vpn/ \
python
INSTALL_ROOT ?= ~/.local/bin
#python_clean: python_clean_online_fxreader_vpn
python_clean_env:
rm -fr \
$(INSTALL_ROOT)/env3;
python_put_env:
[[ -d $(INSTALL_ROOT)/env3 ]] || (\
uv venv --system-site-packages --seed $(INSTALL_ROOT)/env3 && \
$(INSTALL_ROOT)/env3/bin/python3 -m pip install --force-reinstall uv \
);
python_clean_dist:
for o in $(PYTHON_PROJECTS); do \
[[ -d $$o/dist ]] || continue; \
echo $$o/dist; \
rm -fr $$o/dist; \
done
python_clean: python_clean_dist python_clean_env
UV_ARGS ?= --offline
python_put_dist:
for f in \
$(PYTHON_PROJECTS); do \
[[ -d $$f/dist ]] && continue; \
echo $$f; \
python3 -m build -n $$f; \
$(INSTALL_ROOT)/env3/bin/python3 -m uv pip install $(UV_ARGS) $$f/dist/*.whl; \
done
ln -sf $(INSTALL_ROOT)/env3/bin/online-fxreader-pr34-commands $(INSTALL_ROOT)/commands
PYTHON_PROJECTS_NAMES ?= online.fxreader.pr34
python_whl:
for f in $(PYTHON_PROJECTS_NAMES); do \
./m.py deploy:wheel -o releases/whl -p $$f; \
done
python_put: python_put_dist python_put_env
dotfiles_put:
mkdir -p $(INSTALL_ROOT)
cp dotfiles/.local/bin/gnome-shortcuts-macbook-air $(INSTALL_ROOT)/
mkdir -p ~/.sway
cp dotfiles/.sway/config ~/.sway/config
cp dotfiles/.zshenv ~/.zshenv
cp dotfiles/.zshrc ~/.zshrc
cp dotfiles/.vimrc ~/.vimrc
cp dotfiles/.tmux.conf ~/.tmux.conf
cp dotfiles/.py3.vimrc ~/.py3.vimrc
cp dotfiles/.py3.vimrc ~/.py3.vimrc
cp dotfiles/.gitconfig ~/.gitconfig
cp -rp \
dotfiles/.ipython/profile_default/ipython_config.py \
~/.ipython/profile_default/ipython_config.py
D1=Code\ -\ OSS; \
for p in \
"dotfiles/.config/$$D1/User/keybindings.json" \
"dotfiles/.config/$$D1/User/settings.json"; do \
commands install -f -p "dotfiles/.config/$$D1" -s "$$p" -t ~/.config/"$$D1"; \
done
#commands install -f -p dotfiles -s dotfiles/ -t ~/.config/
PLATFORM ?= macbook_air_2012
PLATFORM_TMP ?= tmp/platform_dotfiles/$(PLATFORM)
dotfiles_put_platform:
@echo to be installed
find platform_dotfiles/$(PLATFORM);
echo remove $(PLATFORM_TMP)'?'; read; sudo rm -fr $(PLATFORM_TMP)
sudo mkdir -p $(PLATFORM_TMP)
sudo cp -rp -T platform_dotfiles/$(PLATFORM)/ $(PLATFORM_TMP)
sudo chown -R root:root $(PLATFORM_TMP)
sudo cp -rp -T $(PLATFORM_TMP) /
sudo udevadm control --reload
sudo systemctl daemon-reload
dotfiles_fetch:
commands install -f -p ~ -s ~/.config/katerc -t dotfiles
commands install -f -p ~ -s ~/.mime.types -t dotfiles
commands install -f -p ~ -s ~/.config/Code\ -\ OSS/User/keybindings.json -t dotfiles
commands install -f -p ~ -s ~/.config/Code\ -\ OSS/User/settings.json -t dotfiles
DOTFILES_VERSION ?= 0.1
dotfiles_deploy:
mkdir -p releases/tar
tar -cvf - \
dotfiles \
| xz --compress -9 --stdout > \
releases/tar/dotfiles-$(DOTFILES_VERSION).tar.xz
systemd:
/usr/bin/env python3 d1/systemd.py
for d in tmp/d1; do \
(\
cd $$d; \
for i in *.service *.timer; do \
sudo ln -s -f $$PWD/$$i /etc/systemd/system/$$i; \
done; \
); \
done
sudo systemctl daemon-reload
venv:
uv venv
uv pip install -p .venv \
-r requirements.txt
venv_compile:
uv pip compile --generate-hashes \
requirements.in > requirements.txt
MYPY_SOURCES ?= \
d1/cpanel.py
mypy:
. .venv/bin/activate && \
mypy --strict --follow-imports silent \
$(MYPY_SOURCES)

19
d1/certbot.py Normal file

@ -0,0 +1,19 @@
#!/usr/bin/env python3
import subprocess
import time
import logging
logger = logging.getLogger(__name__)
logging.basicConfig(level=logging.INFO)
while True:
subprocess.check_call([
'docker', 'compose', 'exec', 'ssl-app', 'certbot', 'renew',
])
subprocess.check_call([
'docker', 'compose', 'exec', 'ssl-app', 'nginx', '-s', 'reload',
])
break

@ -1,4 +1,5 @@
import subprocess
import os
import requests
import sys
import io
@ -10,103 +11,131 @@ import logging
import json
import time
with io.open(
'tmp/d1/cpanel.json', 'r'
) as f:
t3 = json.load(f)
logger = logging.getLogger(__name__)
t2 = copy.deepcopy(t3)
for k in t2:
v = t2[k]
v['task'] = lambda : subprocess.Popen(
v['task_cmd'],
stdin=subprocess.DEVNULL,
)
def stop_task(task):
task.terminate()
try:
task.wait(1)
except:
task.kill()
class Launcher:
def run(self):
logging.basicConfig(level=logging.INFO)
t1 = dict()
with io.open(
'tmp/d1/cpanel.json', 'r'
) as f:
t3 = json.load(f)
shutdown = False
t2 = copy.deepcopy(t3)
ssh_known_hosts : list[str] = []
while True:
try:
for k, v in t2.items():
if not k in t1:
logging.info(json.dumps(dict(
task=k,
status='starting',
)))
t1[k] = v['task']()
logging.info(json.dumps(dict(
task=k,
status='started',
)))
continue
if 'ssh_known_hosts' in v:
ssh_known_hosts.append(v['ssh_known_hosts'])
o = t1[k]
if len(ssh_known_hosts) > 0:
subprocess.check_call(
r'''
mkdir -p ~/.ssh && \
cat $SSH_KNOWN_HOSTS > ~/.ssh/known_hosts
''', env=dict(list(os.environ.items())) | dict(
SSH_KNOWN_HOSTS=' '.join(ssh_known_hosts),
),
shell=True
)
not_alive = None
for k in t2:
v = t2[k]
v['task'] = lambda : subprocess.Popen(
v['task_cmd'],
stdin=subprocess.DEVNULL,
)
def stop_task(task: subprocess.Popen[bytes]) -> None:
task.terminate()
try:
not_alive = not (
requests.get(v['url'], timeout=0.5).status_code
== 200
)
task.wait(1)
except:
logging.error(json.dumps(dict(
error=traceback.format_exc(),
time_iso=datetime.datetime.now().isoformat(),
)))
not_alive = True
task.kill()
if not_alive:
logging.error(json.dumps(
dict(
args=o.args,
k=k,
#o=pprint.pformat(o.__dict__),
status='not_alive',
time_iso=datetime.datetime.now().isoformat(),
)
))
t1 = dict()
#stop_task(o)
#del t1[k]
continue
shutdown = False
if not o.poll() is None:
logging.error(json.dumps(
dict(
#o=pprint.pformat(o.__dict__),
args=o.args,
k=k,
return_code=o.poll(),
status='crashed',
time_iso=datetime.datetime.now().isoformat(),
)
))
del t1[k]
continue
while True:
try:
for k, v in t2.items():
if not k in t1:
logging.info(json.dumps(dict(
task=k,
status='starting',
)))
t1[k] = v['task']()
logging.info(json.dumps(dict(
task=k,
status='started',
)))
continue
if shutdown:
break
o = t1[k]
print('\r%s tasks %d' % (
datetime.datetime.now().isoformat(),
len(t1),
), end='')
sys.stdout.flush()
except KeyboardInterrupt:
print('\nshutting down')
break
finally:
time.sleep(5 * 60)
not_alive = None
for o in t1:
stop_task(o)
try:
not_alive = not (
requests.get(v['url'], timeout=0.5).status_code
== 200
)
except:
logging.error(json.dumps(dict(
error=traceback.format_exc(),
time_iso=datetime.datetime.now().isoformat(),
)))
not_alive = True
if not_alive:
logging.error(json.dumps(
dict(
args=o.args,
k=k,
#o=pprint.pformat(o.__dict__),
status='not_alive',
time_iso=datetime.datetime.now().isoformat(),
)
))
#stop_task(o)
#del t1[k]
continue
if not o.poll() is None:
logging.error(json.dumps(
dict(
#o=pprint.pformat(o.__dict__),
args=o.args,
k=k,
return_code=o.poll(),
status='crashed',
time_iso=datetime.datetime.now().isoformat(),
)
))
del t1[k]
continue
if shutdown:
break
print('\r%s tasks %d' % (
datetime.datetime.now().isoformat(),
len(t1),
), end='')
sys.stdout.flush()
except KeyboardInterrupt:
print('\nshutting down')
break
finally:
time.sleep(5 * 60)
for o in t1:
stop_task(o)
if __name__ == '__main__':
Launcher().run()

@ -1,15 +0,0 @@
#!/bin/sh
mkdir -p ~/.local/bin
cp dotfiles/.local/bin/commands ~/.local/bin/commands
mkdir -p ~/.sway
cp dotfiles/.sway/config ~/.sway/config
cp dotfiles/.zshenv ~/.zshenv
cp dotfiles/.zshrc ~/.zshrc
cp dotfiles/.vimrc ~/.vimrc
cp dotfiles/.py3.vimrc ~/.py3.vimrc
cp dotfiles/.py3.vimrc ~/.py3.vimrc
cp dotfiles/.gitconfig ~/.gitconfig
cp -rp \
dotfiles/.ipython/profile_default/ipython_config.py \
~/.ipython/profile_default/ipython_config.py

@ -0,0 +1,11 @@
[Unit]
Description=fxreader.online-certbot
[Service]
Type=oneshot
ExecStart=/usr/bin/python3 d1/certbot.py
WorkingDirectory={{PROJECT_ROOT}}
#Restart=always
#[Install]
#WantedBy=multi-user.target

@ -0,0 +1,9 @@
[Unit]
Description=fxreader.online-certbot-timer
[Timer]
OnUnitActiveSec=1d
OnBootSec=1m
[Install]
WantedBy=timers.target

@ -0,0 +1,16 @@
[Unit]
Description=fxreader.online-service
Requires=docker.service
After=docker.service
[Service]
#Type=oneshot
ExecStart=/usr/bin/docker compose up --force-recreate --remove-orphans
ExecStop=/usr/bin/docker compose down
WorkingDirectory={{PROJECT_ROOT}}
StandardOutput=null
StandardError=null
Restart=always
[Install]
WantedBy=multi-user.target

@ -43,9 +43,23 @@ def forward(
else:
server_name = 'default_server'
if not server_name in sections:
if (
not server_name in sections
):
sections[server_name] = []
if 'client_max_body_size' in entry:
client_max_body_size = entry['client_max_body_size']
else:
client_max_body_size = '50M'
assert isinstance(client_max_body_size, str)
sections[server_name].append(
r'''
client_max_body_size %s;
''' % client_max_body_size
)
location_get = lambda location_body, location_path2, prefix=None,: (
r'''
@ -138,7 +152,7 @@ server {
server_name {server_name};
listen 80 {default_server};
client_max_body_size 50M;
#client_max_body_size 50M;
{sections_config}
}
@ -199,6 +213,80 @@ def ssl(input_json, output_conf):
servers = []
if 'stream_server' in ssl_nginx:
upstream_servers = []
server_names = []
if 'by_server_name' in ssl_nginx['stream_server']:
for k, v in ssl_nginx['stream_server']['by_server_name'].items():
upstream_servers.append(
'upstream %s { server %s; }' % (
v['upstream_name'],
v['url'],
)
)
server_names.append(
'"%s" %s;' % (
v['server_name'], v['upstream_name'],
)
)
if 'ssh' in ssl_nginx['stream_server']:
ssh_section = 'upstream ssh { server {ssh}; }'.replace(
'{ssh}',
ssl_nginx['stream_server']['ssh'],
)
else:
ssh_section = ''
ssl_port = 444
stream_server = r'''
stream {
upstream web {
server 127.0.0.1:444;
}
{upstream_servers}
{ssh_section}
map $ssl_preread_protocol $upstream_protocol {
default ssh;
"TLSv1.2" $upstream_server_name;
"TLSv1.3" $upstream_server_name;
}
map $ssl_preread_server_name $upstream_server_name {
default web;
{server_names}
}
# SSH and SSL on the same port
server {
listen 443;
ssl_preread on;
proxy_pass $upstream_protocol;
}
}
'''.replace(
'{upstream_servers}', ''.join([
' ' + o + '\n'
for o in upstream_servers
]),
).replace(
'{ssh_section}', ssh_section,
).replace(
'{server_names}', ''.join([
' ' + o + '\n'
for o in server_names
]),
)
else:
stream_server = ''
ssl_port = 443
if 'default_server' in ssl_nginx:
server = ssl_nginx['default_server']
@ -211,7 +299,7 @@ server {
set $t1 $http_x_forwarded_for;
}
listen 443 ssl default_server;
listen {ssl_port} ssl default_server;
server_name _;
client_max_body_size {client_max_body_size};
@ -227,6 +315,8 @@ server {
'{client_max_body_size}', server['client_max_body_size'],
).replace(
'{domain_key}', server['domain_key'],
).replace(
'{ssl_port}', '%d' % ssl_port,
)
)
@ -264,7 +354,7 @@ server {
set $t1 $http_x_forwarded_for;
}
listen 443 ssl;
listen {ssl_port} ssl;
server_name {server_names};
client_max_body_size {client_max_body_size};
@ -291,20 +381,27 @@ server {
'{client_max_body_size}', server['client_max_body_size'],
).replace(
'{domain_key}', server['domain_key'],
).replace(
'{ssl_port}', '%d' % ssl_port,
)
)
with io.open(
output_conf,
'w'
) as f:
f.write(
r'''
load_module "modules/ngx_stream_module.so";
events {
multi_accept on;
worker_connections 64;
}
{stream_server}
http {
log_format main
'[$time_local][$remote_addr:$remote_port, $http_x_forwarded_for, $t1, $http_host]'
@ -325,7 +422,9 @@ http {
'' close;
}
}
'''.replace('{servers}', '\n'.join(servers))
'''\
.replace('{servers}', '\n'.join(servers)) \
.replace('{stream_server}', stream_server)
)

41
d1/systemd.py Normal file

@ -0,0 +1,41 @@
#!/usr/bin/env python3
import os
import pathlib
import io
import glob
import subprocess
import logging
logger = logging.getLogger(__name__)
logging.basicConfig(level=logging.INFO)
cache_path = pathlib.Path.cwd() / 'tmp'
project_root = pathlib.Path.cwd()
logger.info(dict(project_root=project_root, cache_path=cache_path,))
for service in [
pathlib.Path(o) for o in sum([
glob.glob('d1/*.service'),
glob.glob('d1/*.timer')
], [])
]:
os.makedirs(str((cache_path / service).parent), exist_ok=True)
with io.open(str(service), 'r') as f:
with io.open(
str(cache_path / service), 'w'
) as f2:
f2.write(
f.read().replace(
'{{PROJECT_ROOT}}',
str(project_root),
)
)
logger.info(dict(
service=str(service),
msg='updated',
))

File diff suppressed because one or more lines are too long

File diff suppressed because it is too large

@ -1,77 +0,0 @@
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge,chrome=1">
<meta name="viewport" content="width=device-width">
<script
src="https://code.jquery.com/jquery-3.6.0.slim.min.js"
integrity="sha256-u7e5khyithlIdTpu22PHhENmPcRdFiHRjhAuHcs05RI="
crossorigin="anonymous"
></script>
<title>Speech synthesiser</title>
<script>
window.context = {};
window.context.books = [];
</script>
<script src="NoSleep.min.js"></script>
<script src="script.js"></script>
<script src="book.js"></script>
<link rel="stylesheet" href="style.css">
<!--[if lt IE 9]>
<script src="//html5shiv.googlecode.com/svn/trunk/html5.js"></script>
<![endif]-->
</head>
<body>
<div class=voice-settings>
<h1>Speech synthesiser</h1>
<p>Enter some text in the input below and press return or the "play" button to hear it. change voices using the dropdown menu.</p>
<form>
<input type="text" class="txt">
<div>
<label for="rate">Rate</label><input type="range" min="0.5" max="2" value="1" step="0.1" id="rate">
<div class="rate-value">1</div>
<div class="clearfix"></div>
</div>
<div>
<label for="pitch">Pitch</label><input type="range" min="0" max="2" value="1" step="0.1" id="pitch">
<div class="pitch-value">1</div>
<div class="clearfix"></div>
</div>
<select class=voice-select>
</select>
<div class="controls">
<button id="play" type="submit">Play</button>
</div>
</form>
</div>
<div class=screen>
<div class=widget>
<select name=book>
<!--<option value=0>Death of a Hear</option>-->
</select>
<br/>
<span>Current Sentence: </span>
<input type=input name=current-sentence></input>
<span>Total Sentences: </span>
<input type=input name=total-sentences disabled>
</input>
<br/>
<input type=button name=add-book value="Add Book">
<input type=button name=read-aloud value="Read Aloud">
<input type=button name=debug value="Debug">
</input>
<br/>
</div>
<pre class=status>
</pre>
</div>
</body>
</html>

@ -1,508 +0,0 @@
$(window).on('load', () => {
var synth = window.speechSynthesis;
var inputForm = document.querySelector('form');
var inputTxt = document.querySelector('.txt');
var voiceSelect = document.querySelector('select');
var pitch = document.querySelector('#pitch');
var pitchValue = document.querySelector('.pitch-value');
var rate = document.querySelector('#rate');
var rateValue = document.querySelector('.rate-value');
var voices = [];
context.nosleep_timer = null;
context.ui = {
voice_settings_div: $('.voice-settings'),
voice_select: $('.voice-select'),
status_pre: $('.status'),
books_select: $('.screen .widget select[name=book]'),
current_sentence_input:
$('.screen .widget input[name=current-sentence]'),
total_sentences_input:
$('.screen .widget input[name=total-sentences]'),
read_aloud:
$('.screen .widget input[name=read-aloud]'),
add_book:
$('.screen .widget input[name=add-book]'),
debug:
$('.screen .widget input[name=debug]'),
};
context.update_books = () => {
context.ui.books_select.empty();
window.context.books.map(
(o, i) => $('<option>').attr('value', '' + i).text(o.slice(0, 10))
).forEach((o) => context.ui.books_select.append(o))
}
context.update_books();
context.sentences = null;
context.pending_stop = false;
context.current_book = null;
context.nosleep = new NoSleep();
context.is_debug = false;
context.log = {
error: [],
info: [],
};
context.callbacks = {
log_error: (msg) => {
if (context.is_debug)
{
console.error(msg);
context.log.error.push(msg);
}
},
enable_no_sleep: () => {
if (context.nosleep_timer != null)
{
context.callbacks.log_error('running already');
}
context.nosleep_timer = setInterval(
() => {
location.hash = 'nosleep' + Math.random();
context.callbacks.update_status();
/*
if ('vibrate' in window.navigator)
{
window.navigator.vibrate(200);
}
*/
}, 1000
);
},
get_state: () => {
let t1 = localStorage['state'];
if (t1)
{
return JSON.parse(t1);
}
else
{
return {};
}
},
get_cookie: (key) => {
/*
return document.cookie.split('; ').map(
(o) => o.split('=')
).reduce(
(b, a) => {
if (a.length == 2) {b[a[0]] = a[1]};
return b
},
{}
)[key];
*/
let t1 = localStorage['state'];
if (t1 != undefined)
{
let t2 = JSON.parse(t1);
return t2[key];
}
else
{
return undefined;
}
},
set_cookie: (key, value) => {
let state = context.callbacks.get_state('state');
state[key] = value;
//document.cookie = `${key}=${value};`;
localStorage['state'] = JSON.stringify(state);
context.callbacks.update_status();
},
disable_no_sleep: () => {
if (context.nosleep_timer == null)
{
context.callbacks.log_error('nothing is running');
}
clearInterval(context.nosleep_timer);
location.hash = '';
context.nosleep_timer = null;
synth.cancel();
},
continuous_reading: async() => {
if (context.is_reading)
{
context.pending_stop = true;
return;
}
context.is_reading = true;
context.nosleep.enable();
context.callbacks.enable_no_sleep();
context.ui.voice_settings_div.addClass('hidden');
context.ui.current_sentence_input.attr(
'disabled',
'disabled'
);
while (
context.callbacks.get_cookie('sentence_id') < context.sentences.length &&
!context.pending_stop
)
{
let sentence =
context.sentences[context.callbacks.get_cookie('sentence_id')];
//context.callbacks.log_error('start');
try {
await context.read_aloud(
context.sentences[
context.callbacks.get_cookie('sentence_id')
]
);
} catch (e) {
context.callbacks.log_error(e);
}
//context.callbacks.log_error('finished');
if (!context.pending_stop)
{
context.callbacks.set_cookie(
'sentence_id',
context.callbacks.get_cookie('sentence_id') + 1
);
}
}
context.pending_stop = false;
context.ui.current_sentence_input.removeAttr('disabled');
context.nosleep.disable();
context.ui.voice_settings_div.removeClass('hidden');
context.callbacks.disable_no_sleep();
context.is_reading = false;
},
update_status: () => {
let data = {};
data.state = context.callbacks.get_state();
if (
context.callbacks.get_cookie('sentence_id') != null &&
context.sentences != null &&
context.callbacks.get_cookie('sentence_id') < context.sentences.length
)
{
data.sentence = context.sentences[context.callbacks.get_cookie('sentence_id')];
}
data.pending_stop = context.pending_stop;
data.is_reading = context.is_reading;
data.log = context.log;
context.ui.current_sentence_input.val(
context.callbacks.get_cookie('sentence_id')
);
data.timestamp = (new Date());
data.version = 'v0.1.7';
data.speech_synthesis = {
paused: synth.paused,
pending: synth.pending,
speaking: synth.speaking,
};
/*
if (!synth.speaking && context.is_reading)
{
synth.cancel();
}
*/
context.ui.status_pre.text(
JSON.stringify(
data,
null,
4,
)
);
},
ui_read_aloud_on_click: async() => {
let book_id = parseInt(context.ui.books_select.val());
if (context.current_book != book_id)
{
context.current_book = book_id;
context.sentences =
context.books[
context.current_book
].replaceAll(/([\.\?\!])\s+/g,'$1\n')
.split('\n');
context.ui.total_sentences_input.val(
context.sentences.length,
);
{
let state = context.callbacks.get_state();
}
}
if (
context.ui.current_sentence_input.val() != ''
)
{
try{
let sentence_id = parseInt(
context.ui.current_sentence_input.val()
);
if (
sentence_id >= 0 &&
sentence_id < context.sentences.length
)
{
context.callbacks.set_cookie(
'sentence_id',
sentence_id
);
}
} catch (e) {
context.callbacks.log_error(e);
}
}
if (context.is_reading && !context.pending_stop)
{
context.pending_stop = true;
}
else
{
context.callbacks.continuous_reading();
}
},
populateVoiceList: () => {
voices = synth.getVoices().sort(function (a, b) {
const aname = a.name.toUpperCase(), bname = b.name.toUpperCase();
if ( aname < bname ) return -1;
else if ( aname == bname ) return 0;
else return +1;
});
//var selectedIndex = voiceSelect.selectedIndex < 0 ? 0 : voiceSelect.selectedIndex;
voiceSelect.innerHTML = '';
for(i = 0; i < voices.length ; i++) {
var option = document.createElement('option');
option.textContent = voices[i].name + ' (' + voices[i].lang + ')';
if(voices[i].default) {
option.textContent += ' -- DEFAULT';
}
{
let voice = context.callbacks.get_cookie('voice');
if (voice && option.textContent == voice)
{
$(option).attr('selected', 'selected');
}
}
option.setAttribute('data-lang', voices[i].lang);
option.setAttribute('data-name', voices[i].name);
voiceSelect.appendChild(option);
}
//voiceSelect.selectedIndex = selectedIndex;
},
init: () => {
let state = context.callbacks.get_state();
context.ui.voice_select.val(state.voice);
if (!state.book_id)
{
context.callbacks.set_cookie(
'book_id',
0,
);
}
if (!state.sentence_id)
{
context.callbacks.set_cookie(
'sentence_id',
0,
);
}
if (state.book_id)
{
context.ui.books_select.find(
'>option',
).eq(state.book_id).attr('selected', 'selected');
}
if (state.sentence_id)
{
context.ui.current_sentence_input.val(
state.sentence_id,
);
}
},
};
context.callbacks.populateVoiceList();
if (speechSynthesis.onvoiceschanged !== undefined) {
speechSynthesis.onvoiceschanged = context.callbacks.populateVoiceList;
}
context.callbacks.init();
context.ui.add_book.on(
'click',
async () => {
alert('fuck');
let book = await (
(await fetch(
'books/' + prompt('enter book file', '1.txt')
)).text()
);
//let book = prompt('enter text', '');
//let title = prompt('enter title', '');
//window.context.books.push(title + '\n' + book);
window.context.books.push(book);
window.context.update_books();
},
);
context.ui.read_aloud.on(
'click',
context.callbacks.ui_read_aloud_on_click,
);
context.ui.voice_select.on(
'change',
() => {
context.callbacks.set_cookie(
'voice',
context.ui.voice_select.val()
);
}
);
context.ui.debug.on(
'click',
() => {
if (context.is_debug)
{
context.is_debug = false;
}
else
{
context.is_debug = true;
}
context.callbacks.update_status();
}
);
context.read_aloud = async (raw_line) => {
line = raw_line.trim();
if (line.length == 0)
{
return;
}
let sleep_detect = null;
let exit = () => {
if (sleep_detect != null)
{
clearInterval(sleep_detect);
}
}
return new Promise((response, reject) => {
if (synth.speaking) {
context.callbacks.log_error('speechSynthesis.speaking');
if (reject != undefined)
{
reject('error');
}
return;
}
let utterThis = new SpeechSynthesisUtterance(line);
utterThis.onend = function (event) {
exit();
context.callbacks.log_error(
'SpeechSynthesisUtterance.onend ' + event.error
);
if (response != undefined)
{
response('done ' + event.error);
}
}
utterThis.onpause = function (event) {
exit();
context.callbacks.log_error('SpeechSynthesisUtterance.onpause');
if (reject != undefined)
{
reject('paused ' + event.error);
}
}
utterThis.onerror = function (event) {
exit();
context.callbacks.log_error(
'SpeechSynthesisUtterance.onerror ' + event.error
);
if (reject != undefined)
{
reject('error ' + event.error);
}
}
let selectedOption = voiceSelect.selectedOptions[0].getAttribute('data-name');
for(i = 0; i < voices.length ; i++) {
if(voices[i].name === selectedOption) {
utterThis.voice = voices[i];
break;
}
}
//window.alert('fuck3');
utterThis.pitch = pitch.value;
utterThis.rate = rate.value;
synth.speak(utterThis);
let silence_count = 0;
sleep_detect = setInterval(
() => {
if (!synth.speaking)
{
context.callbacks.log_error(
'silence count is ' + silence_count
)
++silence_count;
}
if (silence_count == 3 || context.pending_stop)
{
exit();
if (context.pending_stop)
{
synth.cancel();
reject('pending stop');
}
else
{
context.callbacks.log_error('phone is sleeping, retry');
response('utterance is not present');
}
/*
context.read_aloud(
line
).then(response).catch(reject);
*/
}
},
100,
);
});
}
function speak(){
let line = inputTxt.value;
if (line !== '') {
context.read_aloud(line);
}
}
inputForm.onsubmit = function(event) {
event.preventDefault();
speak();
inputTxt.blur();
}
pitch.onchange = function() {
pitchValue.textContent = pitch.value;
}
rate.onchange = function() {
rateValue.textContent = rate.value;
}
voiceSelect.onchange = function(){
speak();
}
});

@ -1,84 +0,0 @@
body, html {
margin: 0;
}
html {
height: 100%;
}
body {
height: 90%;
max-width: 800px;
margin: 0 auto;
}
h1, p {
font-family: sans-serif;
text-align: center;
padding: 20px;
}
.txt, select, form > div {
display: block;
margin: 0 auto;
font-family: sans-serif;
font-size: 16px;
padding: 5px;
}
.txt {
width: 80%;
}
select {
width: 83%;
}
form > div {
width: 81%;
}
.txt, form > div {
margin-bottom: 10px;
overflow: auto;
}
.clearfix {
clear: both;
}
label {
float: left;
width: 10%;
line-height: 1.5;
}
.rate-value, .pitch-value {
float: right;
width: 5%;
line-height: 1.5;
}
#rate, #pitch {
float: right;
width: 81%;
}
.controls {
text-align: center;
margin-top: 10px;
}
.controls button {
padding: 10px;
}
.hidden
{
display: none !important;
}
pre {
word-break: break-all;
white-space: pre-wrap;
}

1
deps/com.github.aiortc.aiortc vendored Submodule

@ -0,0 +1 @@
Subproject commit adef10a8c41f5c550622879370a40f8a9e545574

10
deps/greasyfork/.editorconfig vendored Normal file

@ -0,0 +1,10 @@
root = true
[*]
end_of_line = lf
insert_final_newline = true
[*.{js,json,yml}]
charset = utf-8
indent_style = space
indent_size = 2

6
deps/greasyfork/.gitattributes vendored Normal file

@ -0,0 +1,6 @@
/.yarn/** linguist-vendored
/.yarn/releases/* binary
/.yarn/plugins/**/* binary
/.pnp.* binary linguist-generated
/dist/** filter=lfs diff=lfs merge=lfs -text

15
deps/greasyfork/.gitignore vendored Normal file

@ -0,0 +1,15 @@
.yarn/*
!.yarn/patches
!.yarn/plugins
!.yarn/releases
!.yarn/sdks
!.yarn/versions
!dist
build
# Swap the comments on the following lines if you wish to use zero-installs
# In that case, don't forget to run `yarn config set enableGlobalCache false`!
# Documentation here: https://yarnpkg.com/features/caching#zero-installs
#!.yarn/cache
.pnp.*

1
deps/greasyfork/README.md vendored Normal file

@ -0,0 +1 @@
# greasyfork

9486
deps/greasyfork/dist/linkedin.user.js vendored Normal file

File diff suppressed because it is too large

File diff suppressed because one or more lines are too long

32
deps/greasyfork/package.json vendored Normal file

@ -0,0 +1,32 @@
{
"name": "greasyfork",
"packageManager": "yarn@4.4.0",
"dependencies": {
"@babel/core": "latest",
"@babel/runtime": "latest",
"@gera2ld/plaid-rollup": "latest",
"@violentmonkey/dom": "latest",
"@violentmonkey/ui": "latest",
"jquery": "latest",
"solid-js": "latest",
"typescript": "latest",
"vite": "latest"
},
"devDependencies": {
"@babel/plugin-transform-react-jsx": "latest",
"@babel/plugin-transform-runtime": "latest",
"@rollup/plugin-typescript": "latest",
"@types/babel__core": "latest",
"@types/babel__plugin-transform-runtime": "latest",
"@types/jquery": "latest",
"@violentmonkey/types": "latest",
"cross-env": "latest",
"postcss": "latest",
"prettier": "latest",
"rollup": "latest",
"rollup-plugin-postcss": "latest",
"rollup-plugin-userscript": "latest",
"tslib": "latest",
"unocss": "latest"
}
}

48
deps/greasyfork/rollup.config.mjs vendored Normal file

@ -0,0 +1,48 @@
import { defineExternal, definePlugins } from '@gera2ld/plaid-rollup';
import { defineConfig } from 'rollup';
import userscript from 'rollup-plugin-userscript';
import typescript from '@rollup/plugin-typescript';
import pkg from './package.json' with { type: 'json' };
export default defineConfig(
Object.entries({
'linkedin': 'src/linkedin/index.ts',
}).map(([name, entry]) => ({
input: entry,
plugins: [
...definePlugins({
esm: true,
minimize: false,
postcss: {
inject: false,
minimize: true,
},
extensions: ['.ts', '.tsx', '.mjs', '.js', '.jsx'],
}),
userscript((meta) => meta.replace('process.env.AUTHOR', pkg.author)),
typescript({ sourceMap: true, inlineSources: true }),
],
external: defineExternal([
'@violentmonkey/ui',
//'@violentmonkey/dom',
'solid-js',
'solid-js/web',
]),
output: {
sourcemap: true,
sourcemapBaseUrl: 'https://gitea.fxreader.online/fxreader.online/freelance-project-34-marketing-blog/media/branch/master/deps/greasyfork/dist/',
format: 'iife',
file: `dist/${name}.user.js`,
globals: {
// Note:
// - VM.solid is just a third-party UMD bundle for solid-js since there is no official one
// - If you don't want to use it, just remove `solid-js` related packages from `external`, `globals` and the `meta.js` file.
'solid-js': 'VM.solid',
'solid-js/web': 'VM.solid.web',
//'@violentmonkey/dom': 'VM',
'@violentmonkey/ui': 'VM',
},
indent: false,
},
})),
);

636
deps/greasyfork/src/linkedin/index.ts vendored Normal file

@ -0,0 +1,636 @@
// ==UserScript==
// @name data extraction linkedin
// @namespace Violentmonkey Scripts
// @match https://www.linkedin.com/*
// @grant GM_getValue
// @grant GM_setValue
// @grant GM_getValues
// @grant GM_setValues
// @grant GM_listValues
// @grant GM_deleteValue
// @grant GM_deleteValues
// @grant GM_addStyle
// @grant GM_addElement
// @version 0.1
// @author Siarhei Siniak
// @license Unlicense
// @description 10/08/2024, 8:44:59 PM
// @run-at document-body
// @inject-into content
// @noframes
// ==/UserScript==
/*
Use this extension to disable CSP for linkedin
https://addons.mozilla.org/en-US/firefox/addon/header-editor/
https://github.com/FirefoxBar/HeaderEditor
https://github.com/violentmonkey/violentmonkey/issues/1335
https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Security-Policy/script-src
{
"request": [],
"sendHeader": [],
"receiveHeader": [
{
"enable": true,
"name": "disable CSP for linkedin",
"ruleType": "modifyReceiveHeader",
"matchType": "domain",
"pattern": "www.linkedin.com",
"exclude": "",
"group": "Ungrouped",
"isFunction": false,
"action": {
"name": "content-security-policy",
"value": ""
}
}
],
"receiveBody": []
}
*/
import $ from "jquery";
import * as VM from "@violentmonkey/dom";
interface Entry {
header: string
teaser?: string
};
interface State {
search: string
};
class Linkedin {
data : Map<string, any>;
is_fullscreen: boolean = false;
ui : {
root: any | null
entries: any | null
search: any | null
state: any | null
};
state : State;
old_state: State | null = null;
constructor() {
this.data = new Map();
this.ui = {
root: null,
entries: null,
search: null,
state: null,
};
this.state = {
search: '',
};
}
clean_page() {
if (location.href.search('empty_body=true') != -1)
{
this.is_fullscreen = true;
$('head').empty();
$('body').empty();
$('body').addClass('no-border');
}
}
async data_load() {
let self = this;
const keys = await GM_listValues();
let loaded = 0;
for (let o of keys)
{
if (!o.startsWith('data-'))
{
return;
}
self.data.set(
o.slice(5,),
await GM_getValue(o)
);
loaded += 1;
}
console.log({action: 'loaded', total: loaded});
}
string_reduce (text: string) {
return text.replaceAll(/\s+/gi, ' ').trim();
}
parse_header() {
let self = this;
return [
$(
'.scaffold-finite-scroll__content > div > .relative .update-components-header'
).map((i, o) => ({
header: o.innerText
})),
$(
'.scaffold-finite-scroll__content > div > .relative .update-components-actor'
).map((i, o) => {
let header = $(o);
let teaser = $(o).parents('.relative')
.parent().find('.feed-shared-update-v2__description-wrapper');
return {
header: self.string_reduce(header.text()),
teaser: self.string_reduce(teaser.text()),
};
})
]
}
async data_add (entry: Entry) {
let self = this;
if (self.data.has(entry.header))
{
return false;
}
self.data.set(entry.header, {
entry: entry,
ts: (new Date()).valueOf(),
});
await GM_setValue(
'data-' + entry.header,
self.data.get(entry.header)
)
console.log('saved ' + entry.header);
console.log(self.data.get(entry.header));
return true;
}
async document_on_changed () {
let self = this;
let state_changed = false;
if (
JSON.stringify(self.state_get()) != JSON.stringify(self.state)
)
{
state_changed = true;
self.old_state = self.state;
self.state = self.state_get();
}
let current_data = self.parse_header();
let changed = false;
for (let o of current_data[0])
{
let current_changed = await self.data_add(o);
if (current_changed)
{
changed = current_changed;
}
}
for (let o of current_data[1])
{
let current_changed = await self.data_add(o);
if (current_changed)
{
changed = current_changed;
}
}
if (
changed || (
state_changed ||
self.ui.entries === null && self.data.size > 0
)
)
{
self.display();
}
}
listener_add() {
let self = this;
return VM.observe(
document.body,
() => {
self.document_on_changed();
}
);
}
display_init() {
let self = this;
self.ui.root = $(`<div class=online-fxreader-linkedin>`);
$(document.body).append(self.ui.root);
if (self.is_fullscreen)
{
self.ui.root.addClass('fullscreen');
}
$('head').append($('<style>').html(`
div.online-fxreader-linkedin {
height: 10em;
overflow: hidden;
z-index: 9999;
position: fixed;
top: 5em;
background: yellow;
margin-left: 1em;
word-wrap: anywhere;
white-space: break-spaces;
margin-right: 1em;
width: calc(100% - 2em);
}
.d-none {
display: none !important;
}
.online-fxreader-linkedin.tray-active .search,
.online-fxreader-linkedin.tray-active .entries
{
display: none;
}
.online-fxreader-linkedin .tray
{
cursor: pointer;
position: absolute;
right: 0px;
z-index: 9999;
}
.online-fxreader-linkedin.tray-active
{
right: 1em;
width: 3em;
height: 3em !important;
}
.online-fxreader-linkedin .search
{
display: flex;
position: sticky;
top: 0px;
background-color: #eee;
}
.online-fxreader-linkedin .search input
{
width: 60em;
}
.online-fxreader-linkedin .entries
{
overflow: scroll;
height: 100%;
}
.online-fxreader-linkedin .entry.even
{
background-color: #eee;
}
.online-fxreader-linkedin .entry.odd
{
background-color: #ddd;
}
.online-fxreader-linkedin .search,
.online-fxreader-linkedin .search input
{
height: 2em;
line-height: 2em;
box-sizing: border-box;
}
.no-border {
padding: unset;
margin: unset;
}
.online-fxreader-linkedin:hover,
.online-fxreader-linkedin.fullscreen
{
height: 80vh;
}
`));
GM_addElement('script', {
"textContent": `
class Linkedin {
constructor() {
let self = this;
this.has_callbacks = false;
this.ui = {
root: () => {
return document.getElementsByClassName('online-fxreader-linkedin')[0];
},
};
self.ui.search = () => {
let search = self.ui.root().getElementsByClassName('search')[0];
let search_input = search.getElementsByTagName('input')[0];
return search_input;
};
self.ui.tray = () => {
// let search = self.ui.root().getElementsByClassName('search')[0];
let tray = self.ui.root().getElementsByClassName('tray')[0];
return tray;
};
self.ui.state = () => {
let state = self.ui.root().getElementsByClassName('state')[0];
return state;
};
}
add_callbacks() {
let self = this;
self.ui.tray().addEventListener(
'click', function(e) {
let o = e.currentTarget;
let cl = o.classList;
let r = self.ui.root();
if (cl.contains('active'))
{
cl.remove('active');
r.classList.add('tray-active');
}
else
{
cl.add('active');
r.classList.remove('tray-active');
}
}
);
}
blah(class_name) {
if (!this.has_callbacks)
{
this.add_callbacks();
this.has_callbacks = true;
}
console.log('blah');
Array.from(
document.getElementsByClassName(class_name)
).forEach((o) => o.remove());
}
state_update(partial) {
let self = this;
let ui_state = self.ui.state();
let old_state = JSON.parse(ui_state.innerText);
ui_state.innerText = JSON.stringify(
{
...old_state,
...partial
}
);
}
search_on_change() {
let self = this;
let search = self.ui.search();
self.state_update(
{
search: search.value
}
);
}
};
const online_fxreader_linkedin = new Linkedin();
console.log('started');
`
});
}
state_get() {
let self = this;
if (self.ui.state && self.ui.state.text() !== '')
{
return JSON.parse(self.ui.state.text());
}
else
{
return {};
}
}
state_set(partial: any) {
let self = this;
self.ui.state.text(
{
...self.state_get(),
...partial
}
);
}
display() {
let self = this;
let sorted_entries = Array.from(self.data.entries()).sort(
(a, b) => a[1].ts - b[1].ts
);
// self.ui.root.empty();
if (self.ui.search === null)
{
self.ui.root.append(
$('<div>').addClass('tray').text('SHOW/HIDE')
);
let search = $('<div>').addClass('search').append(
$('<input>').val(self.state.search)
).attr(
'onkeyup',
`online_fxreader_linkedin.search_on_change()`,
);
search.append(
$('<div>').addClass('total')
);
self.ui.root.append(search);
self.ui.search = search;
}
if (self.ui.state === null)
{
self.ui.state = $('<div>').addClass('state d-none').text(
JSON.stringify(self.state)
);
self.ui.root.append(self.ui.state);
}
else
{
}
//state_set(old_state);
let entries = null;
if (self.ui.entries === null)
{
entries = $('<div>').addClass('entries');
self.ui.root.append(entries);
self.ui.entries = entries
}
else
{
entries = self.ui.entries;
entries.empty();
}
let keywords = (self.state?.search || '').split(/\s+/).map((o) => {
let action = '';
let word = '';
if (o.length > 0)
{
if (o[0] == '+')
{
action = 'include';
word = o.slice(1,);
}
else if (o[0] == '-')
{
action = 'exclude';
word = o.slice(1,);
}
else
{
action = 'include';
word = o;
}
}
return {
action,
word,
};
}).filter((o) => o.action !== '' && o.word !== '');
let filtered_entries = sorted_entries.filter((o) => {
let match = true;
let text = JSON.stringify(o);
for (let k of keywords)
{
if (k.action == 'include')
{
if (text.search(k.word) == -1)
{
match = false;
}
}
else if (k.action == 'exclude')
{
if (text.search(k.word) != -1)
{
match = false;
}
}
if (!match)
{
break;
}
}
return match;
})
self.ui.search.find('.total').text(
filtered_entries.length
);
let i = 0;
for (let o of filtered_entries.reverse())
{
let raw = JSON.stringify(o[1]);
let ts = (new Date(o[1].ts));
let entry = $('<div>').addClass('entry');
if (i % 2 == 0)
{
entry.addClass('even');
}
else
{
entry.addClass('odd');
}
entry.append(
$('<div>').addClass('ts').text(
ts.toISOString(),
)
);
entry.append(
$('<div>').addClass('header').text(
o[1].entry.header
)
);
entry.append(
$('<div>').addClass('teaser').text(
o[1].entry.teaser
)
);
// entry.append($('<pre>').text(raw));
entries.append(entry);
++i;
}
GM_addElement('script', {
"class": 'bridge',
"textContent": `
online_fxreader_linkedin.blah('bridge');
`
});
}
}
const l = new Linkedin();
(async () => {
l.clean_page();
await l.data_load();
const disconnect = l.listener_add();
l.display_init();
})();

1
deps/greasyfork/src/types/vm.d.ts vendored Normal file

@ -0,0 +1 @@
import '@violentmonkey/types';

115
deps/greasyfork/tsconfig.json vendored Normal file

@ -0,0 +1,115 @@
{
"include": ["src"],
"compilerOptions": {
/* Visit https://aka.ms/tsconfig to read more about this file */
/* Projects */
// "incremental": true, /* Save .tsbuildinfo files to allow for incremental compilation of projects. */
// "composite": true, /* Enable constraints that allow a TypeScript project to be used with project references. */
// "tsBuildInfoFile": "./.tsbuildinfo", /* Specify the path to .tsbuildinfo incremental compilation file. */
// "disableSourceOfProjectReferenceRedirect": true, /* Disable preferring source files instead of declaration files when referencing composite projects. */
// "disableSolutionSearching": true, /* Opt a project out of multi-project reference checking when editing. */
// "disableReferencedProjectLoad": true, /* Reduce the number of projects loaded automatically by TypeScript. */
/* Language and Environment */
"target": "esnext", /* Set the JavaScript language version for emitted JavaScript and include compatible library declarations. */
"lib": [
"DOM",
"ES6",
"DOM.Iterable",
"ScriptHost",
"ESNext"
], /* Specify a set of bundled library declaration files that describe the target runtime environment. */
// "jsx": "preserve", /* Specify what JSX code is generated. */
// "experimentalDecorators": true, /* Enable experimental support for legacy experimental decorators. */
// "emitDecoratorMetadata": true, /* Emit design-type metadata for decorated declarations in source files. */
// "jsxFactory": "", /* Specify the JSX factory function used when targeting React JSX emit, e.g. 'React.createElement' or 'h'. */
// "jsxFragmentFactory": "", /* Specify the JSX Fragment reference used for fragments when targeting React JSX emit e.g. 'React.Fragment' or 'Fragment'. */
// "jsxImportSource": "", /* Specify module specifier used to import the JSX factory functions when using 'jsx: react-jsx*'. */
// "reactNamespace": "", /* Specify the object invoked for 'createElement'. This only applies when targeting 'react' JSX emit. */
// "noLib": true, /* Disable including any library files, including the default lib.d.ts. */
// "useDefineForClassFields": true, /* Emit ECMAScript-standard-compliant class fields. */
// "moduleDetection": "auto", /* Control what method is used to detect module-format JS files. */
/* Modules */
"module": "esnext", /* Specify what module code is generated. */
// "rootDir": "./", /* Specify the root folder within your source files. */
"moduleResolution": "node", /* Specify how TypeScript looks up a file from a given module specifier. */
// "baseUrl": "./", /* Specify the base directory to resolve non-relative module names. */
// "paths": {}, /* Specify a set of entries that re-map imports to additional lookup locations. */
// "rootDirs": [], /* Allow multiple folders to be treated as one when resolving modules. */
// "typeRoots": [], /* Specify multiple folders that act like './node_modules/@types'. */
// "types": [], /* Specify type package names to be included without being referenced in a source file. */
// "allowUmdGlobalAccess": true, /* Allow accessing UMD globals from modules. */
// "moduleSuffixes": [], /* List of file name suffixes to search when resolving a module. */
// "allowImportingTsExtensions": true, /* Allow imports to include TypeScript file extensions. Requires '--moduleResolution bundler' and either '--noEmit' or '--emitDeclarationOnly' to be set. */
// "resolvePackageJsonExports": true, /* Use the package.json 'exports' field when resolving package imports. */
// "resolvePackageJsonImports": true, /* Use the package.json 'imports' field when resolving imports. */
// "customConditions": [], /* Conditions to set in addition to the resolver-specific defaults when resolving imports. */
// "resolveJsonModule": true, /* Enable importing .json files. */
// "allowArbitraryExtensions": true, /* Enable importing files with any extension, provided a declaration file is present. */
// "noResolve": true, /* Disallow 'import's, 'require's or '<reference>'s from expanding the number of files TypeScript should add to a project. */
/* JavaScript Support */
// "allowJs": true, /* Allow JavaScript files to be a part of your program. Use the 'checkJS' option to get errors from these files. */
// "checkJs": true, /* Enable error reporting in type-checked JavaScript files. */
// "maxNodeModuleJsDepth": 1, /* Specify the maximum folder depth used for checking JavaScript files from 'node_modules'. Only applicable with 'allowJs'. */
/* Emit */
// "declaration": true, /* Generate .d.ts files from TypeScript and JavaScript files in your project. */
// "declarationMap": true, /* Create sourcemaps for d.ts files. */
// "emitDeclarationOnly": true, /* Only output d.ts files and not JavaScript files. */
"sourceMap": true, /* Create source map files for emitted JavaScript files. */
// "inlineSourceMap": true, /* Include sourcemap files inside the emitted JavaScript. */
// "outFile": "./", /* Specify a file that bundles all outputs into one JavaScript file. If 'declaration' is true, also designates a file that bundles all .d.ts output. */
"outDir": "build/", /* Specify an output folder for all emitted files. */
// "removeComments": true, /* Disable emitting comments. */
// "noEmit": true, /* Disable emitting files from a compilation. */
// "importHelpers": true, /* Allow importing helper functions from tslib once per project, instead of including them per-file. */
// "downlevelIteration": true, /* Emit more compliant, but verbose and less performant JavaScript for iteration. */
// "sourceRoot": "", /* Specify the root path for debuggers to find the reference source code. */
// "mapRoot": "", /* Specify the location where debugger should locate map files instead of generated locations. */
// "inlineSources": true, /* Include source code in the sourcemaps inside the emitted JavaScript. */
// "emitBOM": true, /* Emit a UTF-8 Byte Order Mark (BOM) in the beginning of output files. */
// "newLine": "crlf", /* Set the newline character for emitting files. */
// "stripInternal": true, /* Disable emitting declarations that have '@internal' in their JSDoc comments. */
// "noEmitHelpers": true, /* Disable generating custom helper functions like '__extends' in compiled output. */
// "noEmitOnError": true, /* Disable emitting files if any type checking errors are reported. */
// "preserveConstEnums": true, /* Disable erasing 'const enum' declarations in generated code. */
// "declarationDir": "./", /* Specify the output directory for generated declaration files. */
/* Interop Constraints */
// "isolatedModules": true, /* Ensure that each file can be safely transpiled without relying on other imports. */
// "verbatimModuleSyntax": true, /* Do not transform or elide any imports or exports not marked as type-only, ensuring they are written in the output file's format based on the 'module' setting. */
// "isolatedDeclarations": true, /* Require sufficient annotation on exports so other tools can trivially generate declaration files. */
// "allowSyntheticDefaultImports": true, /* Allow 'import x from y' when a module doesn't have a default export. */
"esModuleInterop": true, /* Emit additional JavaScript to ease support for importing CommonJS modules. This enables 'allowSyntheticDefaultImports' for type compatibility. */
// "preserveSymlinks": true, /* Disable resolving symlinks to their realpath. This correlates to the same flag in node. */
"forceConsistentCasingInFileNames": true, /* Ensure that casing is correct in imports. */
/* Type Checking */
"strict": true, /* Enable all strict type-checking options. */
// "noImplicitAny": true, /* Enable error reporting for expressions and declarations with an implied 'any' type. */
// "strictNullChecks": true, /* When type checking, take into account 'null' and 'undefined'. */
// "strictFunctionTypes": true, /* When assigning functions, check to ensure parameters and the return values are subtype-compatible. */
// "strictBindCallApply": true, /* Check that the arguments for 'bind', 'call', and 'apply' methods match the original function. */
// "strictPropertyInitialization": true, /* Check for class properties that are declared but not set in the constructor. */
// "noImplicitThis": true, /* Enable error reporting when 'this' is given the type 'any'. */
// "useUnknownInCatchVariables": true, /* Default catch clause variables as 'unknown' instead of 'any'. */
// "alwaysStrict": true, /* Ensure 'use strict' is always emitted. */
// "noUnusedLocals": true, /* Enable error reporting when local variables aren't read. */
// "noUnusedParameters": true, /* Raise an error when a function parameter isn't read. */
// "exactOptionalPropertyTypes": true, /* Interpret optional property types as written, rather than adding 'undefined'. */
// "noImplicitReturns": true, /* Enable error reporting for codepaths that do not explicitly return in a function. */
// "noFallthroughCasesInSwitch": true, /* Enable error reporting for fallthrough cases in switch statements. */
// "noUncheckedIndexedAccess": true, /* Add 'undefined' to a type when accessed using an index. */
// "noImplicitOverride": true, /* Ensure overriding members in derived classes are marked with an override modifier. */
// "noPropertyAccessFromIndexSignature": true, /* Enforces using indexed accessors for keys declared using an indexed type. */
// "allowUnusedLabels": true, /* Disable error reporting for unused labels. */
// "allowUnreachableCode": true, /* Disable error reporting for unreachable code. */
/* Completeness */
// "skipDefaultLibCheck": true, /* Skip type checking .d.ts files that are included with TypeScript. */
"skipLibCheck": true /* Skip type checking all .d.ts files. */
}
}

4844
deps/greasyfork/yarn.lock vendored Normal file

File diff suppressed because it is too large

1
deps/online.fxreader.nartes.books vendored Submodule

@ -0,0 +1 @@
Subproject commit 3c691ef68d8899edf328d5b06135c0d3b02e7940

@ -7,7 +7,7 @@ services:
volumes:
- ./d1/:/app/d1/:ro
- ./tmp/cache/:/app/tmp/cache/:ro
restart: always
restart: on-failure
ssl-app:
build:
context: .
@ -16,36 +16,40 @@ services:
- ./d1/:/app/d1/:ro
- ./tmp/d1/:/app/tmp/d1/:ro
- ./tmp/d1/letsencrypt:/etc/letsencrypt:rw
restart: always
restart: on-failure
cpanel:
build:
context: .
dockerfile: ./docker/cpanel/Dockerfile
links:
- app
#links:
# - app
volumes:
- ./d1/:/app/d1:ro
- ./tmp/d1/:/app/tmp/d1/:ro
restart: always
restart: on-failure
dynu:
build:
context: .
dockerfile: ./docker/dynu/Dockerfile
profiles:
- broken
volumes:
- ./d1/dynu_update.py:/app/d1/dynu_update.py:ro
- ./tmp/cache/dynu.auth.json:/app/tmp/cache/dynu.auth.json:ro
restart: always
links:
- ngrok
restart: on-failure
# links:
# - ngrok
ngrok:
image: wernight/ngrok
links:
- app
#links:
# - app
profiles:
- broken
command: ['ngrok', 'http', 'app:80']
volumes:
- ./tmp/cache/ngrok.yml:/home/ngrok/.ngrok2/ngrok.yml:ro
restart: always
restart: on-failure
#forward:
# build:
# context: .

@ -4,7 +4,7 @@ RUN apk add python3
RUN apk add tini
RUN apk add bash curl
RUN apk add py3-pip
RUN pip3 install requests
RUN pip3 install --break-system-packages requests
WORKDIR /app

19
docker/js/.zshrc Normal file

@ -0,0 +1,19 @@
# The following lines were added by compinstall
zstyle ':completion:*' completer _expand _complete _ignored _correct _approximate
zstyle :compinstall filename '~/.zshrc'
setopt INC_APPEND_HISTORY SHARE_HISTORY AUTO_PUSHD PUSHD_IGNORE_DUPS
setopt PROMPTSUBST
autoload -Uz compinit
compinit
# End of lines added by compinstall
# Lines configured by zsh-newuser-install
HISTFILE=~/.histfile
HISTSIZE=1000000
SAVEHIST=1000000
# End of lines configured by zsh-newuser-install
bindkey -d
bindkey -v

15
docker/js/Dockerfile Normal file

@ -0,0 +1,15 @@
FROM node as base
ENV DEBIAN_FRONTEND noninteractive
RUN \
apt-get update -yy && \
apt-get install \
tini zsh less tree \
-yy
RUN chsh -s /usr/bin/zsh
WORKDIR /app/deps/greasyfork
ENTRYPOINT ["tini", "--"]
CMD ["bash", "/app/docker/js/init.sh"]
# CMD ["sleep", "999999999999999999"]

@ -0,0 +1,17 @@
version: '3.7'
services:
js:
build:
context: .
dockerfile: ./docker/js/Dockerfile
volumes:
- ./deps/greasyfork:/app/deps/greasyfork:rw
- ./tmp/cache/js/root-cache:/root/.cache:rw
- ./tmp/cache/js/root-yarn:/root/.yarn:rw
- ./docker/js:/app/docker/js:ro
- ./tmp/cache/js:/app/tmp/cache/js:rw
deploy:
resources:
limits:
cpus: 1.5
memory: 1G

7
docker/js/init.sh Normal file

@ -0,0 +1,7 @@
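# Init script for the js container: enable corepack for yarn, link the shared zsh
# config and history from the mounted repo, then drop into an interactive zsh.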
corepack enable
corepack install
# yarn init -2
ln -sf /app/docker/js/.zshrc ~/.zshrc
ln -sf /app/tmp/cache/js/.histfile ~/.histfile
export EDITOR=vim
exec /usr/bin/zsh -l

@ -7,6 +7,7 @@ RUN apk add nginx
RUN apk add tini
#RUN pip3 install requests certbot
RUN apk add certbot
RUN apk add nginx-mod-stream
WORKDIR /app

@ -0,0 +1,156 @@
// Place your key bindings in this file to override the defaults
[
{
"key": "alt+z",
"command": "-editor.action.toggleWordWrap"
},
{
"key": "alt+z",
"command": "-workbench.action.terminal.sizeToContentWidth",
"when": "terminalFocus && terminalHasBeenCreated && terminalIsOpen || terminalFocus && terminalIsOpen && terminalProcessSupported"
},
{
"key": "alt+r",
"command": "workbench.action.toggleMaximizeEditorGroup",
"when": "editorPartMaximizedEditorGroup || editorPartMultipleEditorGroups"
},
{
"key": "ctrl+k ctrl+m",
"command": "-workbench.action.toggleMaximizeEditorGroup",
"when": "editorPartMaximizedEditorGroup || editorPartMultipleEditorGroups"
},
{
"key": "alt+r",
"command": "workbench.action.toggleMaximizedPanel",
"when": "!editorTextFocus"
},
{
"key": "ctrl+p",
"command": "-extension.vim_ctrl+p",
"when": "editorTextFocus && vim.active && vim.use<C-p> && !inDebugRepl || vim.active && vim.use<C-p> && !inDebugRepl && vim.mode == 'CommandlineInProgress' || vim.active && vim.use<C-p> && !inDebugRepl && vim.mode == 'SearchInProgressMode'"
},
{
"key": "alt+t",
"command": "workbench.action.terminal.toggleTerminal",
"when": "terminal.active"
},
{
"key": "ctrl+`",
"command": "-workbench.action.terminal.toggleTerminal",
"when": "terminal.active"
},
{
"key": "ctrl+e",
"command": "-workbench.action.quickOpen"
},
{
"key": "ctrl+n",
"command": "-extension.vim_ctrl+n",
"when": "editorTextFocus && vim.active && vim.use<C-n> && !inDebugRepl || vim.active && vim.use<C-n> && !inDebugRepl && vim.mode == 'CommandlineInProgress' || vim.active && vim.use<C-n> && !inDebugRepl && vim.mode == 'SearchInProgressMode'"
},
{
"key": "ctrl+t",
"command": "-extension.vim_ctrl+t",
"when": "editorTextFocus && vim.active && vim.use<C-t> && !inDebugRepl"
},
{
"key": "ctrl+f",
"command": "-extension.vim_ctrl+f",
"when": "editorTextFocus && vim.active && vim.use<C-f> && !inDebugRepl && vim.mode != 'Insert'"
},
{
"key": "ctrl+f",
"command": "-actions.find",
"when": "editorFocus || editorIsOpen"
},
{
"key": "ctrl+f",
"command": "workbench.action.findInFiles"
},
{
"key": "ctrl+shift+f",
"command": "-workbench.action.findInFiles"
},
{
"key": "ctrl+g",
"command": "-workbench.action.gotoLine"
},
{
"key": "ctrl+g",
"command": "-workbench.action.terminal.goToRecentDirectory",
"when": "terminalFocus && terminalHasBeenCreated || terminalFocus && terminalProcessSupported"
},
{
"key": "alt+r",
"command": "-toggleSearchRegex",
"when": "searchViewletFocus"
},
{
"key": "alt+r",
"command": "-toggleFindRegex",
"when": "editorFocus"
},
{
"key": "alt+r",
"command": "-workbench.action.terminal.toggleFindRegex",
"when": "terminalFindVisible && terminalHasBeenCreated || terminalFindVisible && terminalProcessSupported"
},
{
"key": "alt+r",
"command": "-toggleSearchEditorRegex",
"when": "inSearchEditor && searchInputBoxFocus"
},
{
"key": "ctrl+/",
"command": "-editor.action.accessibleViewAcceptInlineCompletion",
"when": "accessibleViewIsShown && accessibleViewCurrentProviderId == 'inlineCompletions'"
},
{
"key": "ctrl+k ctrl+/",
"command": "-editor.foldAllBlockComments",
"when": "editorTextFocus && foldingEnabled"
},
{
"key": "ctrl+/",
"command": "-toggleExplainMode",
"when": "suggestWidgetVisible"
},
{
"key": "ctrl+/",
"command": "-workbench.action.chat.attachContext",
"when": "inChatInput && chatLocation == 'editing-session' || inChatInput && chatLocation == 'editor' || inChatInput && chatLocation == 'notebook' || inChatInput && chatLocation == 'panel' || inChatInput && chatLocation == 'terminal'"
},
{
"key": "ctrl+/",
"command": "-workbench.action.terminal.sendSequence",
"when": "terminalFocus"
},
{
"key": "shift+alt+l",
"command": "workbench.action.editorLayoutTwoRowsRight"
},
{
"key": "ctrl+b",
"command": "-extension.vim_ctrl+b",
"when": "editorTextFocus && vim.active && vim.use<C-b> && !inDebugRepl && vim.mode != 'Insert'"
},
{
"key": "ctrl+w",
"command": "-workbench.action.closeActiveEditor"
},
{
"key": "ctrl+w",
"command": "-workbench.action.closeGroup",
"when": "activeEditorGroupEmpty && multipleEditorGroups"
},
{
"key": "ctrl+w",
"command": "-extension.vim_ctrl+w",
"when": "editorTextFocus && vim.active && vim.use<C-w> && !inDebugRepl"
},
{
"key": "ctrl+w",
"command": "workbench.action.closeActiveEditor",
"when": "editorTextFocus"
}
]

@ -0,0 +1,151 @@
{
"editor.wordWrap": "on",
"editor.minimap.autohide": true,
"editor.minimap.maxColumn": 80,
"editor.minimap.size": "fit",
"python.experiments.enabled": false,
"debugpy.debugJustMyCode": false,
"python.REPL.enableREPLSmartSend": false,
"python.terminal.activateEnvironment": false,
"python.testing.autoTestDiscoverOnSaveEnabled": false,
"python.languageServer": "None",
"typescript.surveys.enabled": false,
"typescript.suggestionActions.enabled": false,
"typescript.tsserver.enableRegionDiagnostics": false,
"typescript.tsserver.maxTsServerMemory": 0.05,
"typescript.tsserver.useSyntaxServer": "never",
"typescript.tsserver.web.typeAcquisition.enabled": false,
"typescript.validate.enable": false,
"typescript.workspaceSymbols.excludeLibrarySymbols": false,
"typescript.check.npmIsInstalled": false,
"typescript.tsserver.web.projectWideIntellisense.enabled": false,
"python.REPL.provideVariables": false,
"git.openRepositoryInParentFolders": "never",
"workbench.enableExperiments": false,
"workbench.cloudChanges.continueOn": "off",
"workbench.cloudChanges.autoResume": "off",
"extensions.autoCheckUpdates": false,
"update.mode": "none",
"workbench.settings.enableNaturalLanguageSearch": false,
"update.showReleaseNotes": false,
"extensions.autoUpdate": false,
"telemetry.telemetryLevel": "off",
"json.schemaDownload.enable": false,
"npm.fetchOnlinePackageInfo": false,
"window.experimentalControlOverlay": false,
"window.commandCenter": false,
"window.confirmBeforeClose": "always",
"window.dialogStyle": "custom",
"window.titleBarStyle": "custom",
"window.customTitleBarVisibility": "windowed",
"window.enableMenuBarMnemonics": false,
"window.menuBarVisibility": "compact",
"issueReporter.experimental.auxWindow": false,
"workbench.colorTheme": "Monokai",
"workbench.preferredDarkColorTheme": "Monokai",
"workbench.preferredHighContrastColorTheme": "Monokai",
"workbench.preferredHighContrastLightColorTheme": "Monokai",
"workbench.preferredLightColorTheme": "Monokai",
"mesonbuild.downloadLanguageServer": false,
// "vim.easymotion": true,
// "vim.incsearch": true,
"vim.useSystemClipboard": true,
// "vim.useCtrlKeys": true,
"vim.hlsearch": true,
// "vim.insertModeKeyBindings": [
// {
// "before": ["j", "j"],
// "after": ["<Esc>"]
// }
// ],
"vim.normalModeKeyBindingsNonRecursive": [
{
"before": ["<leader>", "w"],
"after": ["<C-w>"],
// "after": ["d", "d"]
},
// {
// "before": ["<C-n>"],
// "commands": [":nohl"]
// },
// {
// "before": ["K"],
// "commands": ["lineBreakInsert"],
// "silent": true
// }
],
"vim.leader": "\\",
// "vim.handleKeys": {
// "<C-a>": false,
// "<C-f>": false
// },
"extensions.experimental.affinity": {
"vscodevim.vim": 1
},
"diffEditor.experimental.showMoves": true,
"diffEditor.hideUnchangedRegions.enabled": true,
"python.locator": "native",
"python.testing.promptToConfigure": false,
"typescript.format.enable": false,
"typescript.format.indentSwitchCase": false,
"typescript.preferences.renameMatchingJsxTags": false,
"typescript.autoClosingTags": false,
"typescript.format.insertSpaceAfterCommaDelimiter": false,
"typescript.format.insertSpaceAfterKeywordsInControlFlowStatements": false,
"typescript.format.insertSpaceAfterOpeningAndBeforeClosingEmptyBraces": false,
"docker.enableDockerComposeLanguageService": false,
"go.useLanguageServer": false,
"search.maxResults": 128,
"search.ripgrep.maxThreads": 1,
"search.searchEditor.defaultNumberOfContextLines": 7,
"search.searchOnType": false,
"task.allowAutomaticTasks": "off",
"task.autoDetect": "off",
"task.quickOpen.detail": false,
"task.reconnection": false,
"javascript.autoClosingTags": false,
"javascript.format.enable": false,
"javascript.format.insertSpaceAfterCommaDelimiter": false,
"javascript.format.insertSpaceAfterFunctionKeywordForAnonymousFunctions": false,
"javascript.format.insertSpaceAfterKeywordsInControlFlowStatements": false,
"javascript.format.insertSpaceAfterOpeningAndBeforeClosingEmptyBraces": false,
"javascript.format.insertSpaceAfterOpeningAndBeforeClosingNonemptyBraces": false,
"javascript.format.insertSpaceAfterSemicolonInForStatements": false,
"javascript.format.insertSpaceBeforeAndAfterBinaryOperators": false,
"javascript.inlayHints.parameterNames.suppressWhenArgumentMatchesName": false,
"javascript.inlayHints.variableTypes.suppressWhenTypeMatchesName": false,
"javascript.preferences.renameMatchingJsxTags": false,
"javascript.preferences.useAliasesForRenames": false,
"javascript.suggest.autoImports": false,
"javascript.suggest.classMemberSnippets.enabled": false,
"javascript.suggest.completeJSDocs": false,
"javascript.suggest.enabled": false,
"javascript.suggest.includeAutomaticOptionalChainCompletions": false,
"javascript.suggest.includeCompletionsForImportStatements": false,
"javascript.suggest.jsdoc.generateReturns": false,
"javascript.suggest.names": false,
"javascript.suggest.paths": false,
"javascript.suggestionActions.enabled": false,
"javascript.updateImportsOnFileMove.enabled": "never",
"javascript.validate.enable": false,
"js/ts.implicitProjectConfig.strictFunctionTypes": false,
"js/ts.implicitProjectConfig.strictNullChecks": false,
"typescript.format.insertSpaceAfterFunctionKeywordForAnonymousFunctions": false,
"typescript.format.insertSpaceAfterOpeningAndBeforeClosingNonemptyBraces": false,
"typescript.format.insertSpaceAfterSemicolonInForStatements": false,
"typescript.format.insertSpaceBeforeAndAfterBinaryOperators": false,
"typescript.inlayHints.parameterNames.suppressWhenArgumentMatchesName": false,
"typescript.inlayHints.variableTypes.suppressWhenTypeMatchesName": false,
"typescript.preferences.useAliasesForRenames": false,
"typescript.reportStyleChecksAsWarnings": false,
"typescript.suggest.autoImports": false,
"typescript.suggest.classMemberSnippets.enabled": false,
"typescript.suggest.completeJSDocs": false,
"typescript.suggest.enabled": false,
"typescript.suggest.includeAutomaticOptionalChainCompletions": false,
"typescript.suggest.includeCompletionsForImportStatements": false,
"typescript.suggest.jsdoc.generateReturns": false,
"typescript.suggest.objectLiteralMethodSnippets.enabled": false,
"typescript.suggest.paths": false,
"typescript.tsc.autoDetect": "off",
}

282
dotfiles/.config/katerc Normal file

@ -0,0 +1,282 @@
[BuildConfig]
AllowedCommandLines=
AutoSwitchToOutput=true
BlockedCommandLines=
UseDiagnosticsOutput=true
[CTags]
GlobalCommand=ctags -R --c++-types=+px --extra=+q --excmd=pattern --exclude=Makefile --exclude=.
GlobalNumTargets=0
[General]
Allow Tab Scrolling=true
Auto Hide Tabs=false
Close After Last=false
Close documents with window=true
Cycle To First Tab=true
Days Meta Infos=30
Diagnostics Limit=12000
Diff Show Style=0
Elide Tab Text=false
Enable Context ToolView=false
Expand Tabs=false
Icon size for left and right sidebar buttons=32
Last Session=calibre
Modified Notification=false
Mouse back button action=0
Mouse forward button action=0
Open New Tab To The Right Of Current=true
Output History Limit=100
Output With Date=false
Quickopen Filter Mode=0
Quickopen List Mode=true
Recent File List Entry Count=10
Restore Window Configuration=true
SDI Mode=false
Save Meta Infos=false
Session Manager Sort Column=0
Session Manager Sort Order=1
Show Full Path in Title=true
Show Menu Bar=true
Show Status Bar=true
Show Symbol In Navigation Bar=true
Show Tab Bar=true
Show Tabs Close Button=true
Show Url Nav Bar=false
Show output view for message type=1
Show text for left and right sidebar=false
Show welcome view for new window=true
Startup Session=manual
Stash new unsaved files=true
Stash unsaved file changes=false
Sync section size with tab positions=false
Tab Double Click New Document=true
Tab Middle Click Close Document=true
Tabbar Tab Limit=0
[KDE]
widgetStyle=Fusion
[KTextEditor Document]
Allow End of Line Detection=true
Auto Detect Indent=true
Auto Reload If State Is In Version Control=true
Auto Save=false
Auto Save Interval=0
Auto Save On Focus Out=false
BOM=false
Backup Local=false
Backup Prefix=
Backup Remote=false
Backup Suffix=~
Camel Cursor=true
Encoding=UTF-8
End of Line=0
Indent On Backspace=true
Indent On Tab=true
Indent On Text Paste=true
Indentation Mode=normal
Indentation Width=4
Keep Extra Spaces=false
Line Length Limit=10000
Newline at End of File=true
On-The-Fly Spellcheck=false
Overwrite Mode=false
PageUp/PageDown Moves Cursor=false
Remove Spaces=1
ReplaceTabsDyn=true
Show Spaces=2
Show Tabs=true
Smart Home=true
Swap Directory=
Swap File Mode=1
Swap Sync Interval=15
Tab Handling=2
Tab Width=2
Trailing Marker Size=1
Use Editor Config=true
Word Wrap=false
Word Wrap Column=80
[KTextEditor Renderer]
Animate Bracket Matching=false
Auto Color Theme Selection=false
Color Theme=Monokai2
Line Height Multiplier=1
Show Indentation Lines=false
Show Whole Bracket Expression=false
Text Font=Terminus,11,-1,5,400,0,0,0,0,0,0,0,0,0,0,1
Text Font Features=
Word Wrap Marker=true
[KTextEditor View]
Allow Mark Menu=true
Auto Brackets=true
Auto Center Lines=0
Auto Completion=true
Auto Completion Preselect First Entry=true
Backspace Remove Composed Characters=false
Bookmark Menu Sorting=0
Bracket Match Preview=true
Chars To Enclose Selection=<>(){}[]'"
Cycle Through Bookmarks=true
Default Mark Type=1
Dynamic Word Wrap=true
Dynamic Word Wrap Align Indent=80
Dynamic Word Wrap At Static Marker=false
Dynamic Word Wrap Indicators=1
Dynamic Wrap not at word boundaries=false
Enable Accessibility=true
Enable Tab completion=false
Enter To Insert Completion=true
Fold First Line=false
Folding Bar=true
Folding Preview=true
Icon Bar=false
Input Mode=1
Keyword Completion=true
Line Modification=true
Line Numbers=true
Max Clipboard History Entries=20
Maximum Search History Size=100
Mouse Paste At Cursor Position=false
Multiple Cursor Modifier=134217728
Persistent Selection=false
Scroll Bar Marks=false
Scroll Bar Mini Map All=true
Scroll Bar Mini Map Width=60
Scroll Bar MiniMap=false
Scroll Bar Preview=true
Scroll Past End=false
Search/Replace Flags=140
Shoe Line Ending Type in Statusbar=false
Show Documentation With Completion=true
Show File Encoding=true
Show Folding Icons On Hover Only=true
Show Line Count=true
Show Scrollbars=0
Show Statusbar Dictionary=true
Show Statusbar Highlighting Mode=true
Show Statusbar Input Mode=true
Show Statusbar Line Column=true
Show Statusbar Tab Settings=true
Show Word Count=true
Smart Copy Cut=true
Statusbar Line Column Compact Mode=true
Text Drag And Drop=true
User Sets Of Chars To Enclose Selection=
Vi Input Mode Steal Keys=false
Vi Relative Line Numbers=false
Word Completion=true
Word Completion Minimal Word Length=3
Word Completion Remove Tail=true
[Konsole]
AutoSyncronizeMode=0
KonsoleEscKeyBehaviour=false
KonsoleEscKeyExceptions=vi,vim,nvim,git
RemoveExtension=false
RunPrefix=
SetEditor=false
[MainWindow]
1366x768 screen: Window-Maximized=true
2 screens: Height=727
2 screens: Width=679
2048x1080 screen: Window-Maximized=true
ToolBarsMovable=Disabled
[PluginSymbolViewer]
ExpandTree=false
SortSymbols=false
TreeView=false
ViewTypes=false
[Printing][HeaderFooter]
FooterBackground=211,211,211
FooterBackgroundEnabled=false
FooterEnabled=true
FooterForeground=0,0,0
FooterFormatCenter=
FooterFormatLeft=
FooterFormatRight=%U
HeaderBackground=211,211,211
HeaderBackgroundEnabled=false
HeaderEnabled=true
HeaderFooterFont=monospace,10,-1,5,400,0,0,0,1,0,0,0,0,0,0,1
HeaderForeground=0,0,0
HeaderFormatCenter=%f
HeaderFormatLeft=%y
HeaderFormatRight=%p
[Printing][Layout]
BackgroundColorEnabled=false
BoxColor=invalid
BoxEnabled=false
BoxMargin=6
BoxWidth=1
ColorScheme=Printing
Font=monospace,10,-1,5,400,0,0,0,1,0,0,0,0,0,0,1
[Printing][Text]
DontPrintFoldedCode=true
Legend=false
LineNumbers=false
[Shortcut Schemes]
Current Scheme=Default
[Shortcuts]
kate_mdi_focus_toolview_kate_private_plugin_katekonsoleplugin=;\s
kate_mdi_sidebar_visibility=;\s
kate_mdi_toolview_kate_private_plugin_katekonsoleplugin=;\s
kate_mdi_toolview_kateproject=Ctrl+B
kate_mdi_toolview_kateprojectinfo=Alt+T
[debugplugin]
DAPConfiguration=
[filetree]
editShade=183,220,246
listMode=false
middleClickToClose=false
shadingEnabled=true
showCloseButton=false
showFullPathOnRoots=false
showToolbar=true
sortRole=0
viewShade=211,190,222
[lspclient]
AllowedServerCommandLines=/usr/bin/clangd -log=error --background-index --limit-results=500 --completion-style=bundled,/usr/bin/pylsp --check-parent-process
AutoHover=true
AutoImport=true
BlockedServerCommandLines=/usr/bin/python -m esbonio
CompletionDocumentation=true
CompletionParens=true
Diagnostics=true
FormatOnSave=false
HighlightGoto=true
IncrementalSync=true
InlayHints=false
Messages=true
ReferencesDeclaration=true
SemanticHighlighting=true
ServerConfiguration=
SignatureHelp=true
SymbolDetails=false
SymbolExpand=true
SymbolSort=false
SymbolTree=true
TypeFormatting=false
[project]
autoCMake=false
autorepository=git
gitStatusDoubleClick=3
gitStatusSingleClick=0
index=false
indexDirectory=
multiProjectCompletion=false
multiProjectGoto=false
restoreProjectsForSessions=false

@ -3,3 +3,5 @@
name = Siarhei Siniak
[core]
pager = less -x2
[fetch]
fsckObjects = true

@ -0,0 +1,13 @@
#!/usr/bin/bash
commands gnome-shortcuts \
-a \
'powersave' \
'commands desktop-services --cpufreq-action powersave' \
'<Shift><Alt>1'
commands gnome-shortcuts \
-a \
'performance' \
'commands desktop-services --cpufreq-action performance' \
'<Shift><Alt>2'

2
dotfiles/.mime.types Normal file

@ -0,0 +1,2 @@
# https://terminalroot.com/how-to-open-markdown-files-with-md-extension-in-firefox/
text/plain txt asc text pm el c h cc hh cxx hxx f90 conf log yaml yml

@ -41,7 +41,7 @@ def f5_1(pattern, flags, info):
#print('fuck')
if b'r' in flags:
while True:
ext_m = re.compile('^.([^\,]+),(.*)$').match(pattern)
ext_m = re.compile(r'^.([^\,]+),(.*)$').match(pattern)
if pattern[:3] in [r'\r,']:
options['recursive'] = True

@ -1,4 +1,3 @@
#
# Copy this to ~/.config/sway/config and edit it to your liking.
#
@ -20,21 +19,10 @@ set $term weston-terminal
# on the original workspace that the command was run on.
#for_window [app_id="^launcher$"] floating enable, sticky enable, resize set 30 ppt 60 ppt, border pixel 10
#set $menu exec $term --class=launcher -e /usr/bin/sway-launcher-desktop
set $dmenu_path /usr/bin/bemenu-run
#set $dmenu_path /usr/bin/bemenu-run
set $dmenu_path rofi -modes run -show run
set $menu $dmenu_path | xargs swaymsg exec --
### Output configuration
#
# Default wallpaper (more resolutions are available in /usr/share/backgrounds/sway/)
#output * bg /usr/share/backgrounds/sway/Sway_Wallpaper_Blue_1920x1080.png fill
#
# Example configuration:
#
# output HDMI-A-1 resolution 1920x1080 position 1920,0
#
# You can get the names of your outputs by running: swaymsg -t get_outputs
output HDMI-A-1 resolution 1920x1080 position 0,0
output eDP-1 resolution 1366x748 position 277,1080
### Idle configuration
#
@ -62,18 +50,27 @@ output eDP-1 resolution 1366x748 position 277,1080
#
# You can get the names of your inputs by running: swaymsg -t get_inputs
# Read `man 5 sway-input` for more information about this section.
input type:pointer {
# tap enabled
natural_scroll enabled
}
input type:touchpad {
tap enabled
natural_scroll enabled
# natural_scroll disabled
}
bindgesture swipe:4:left workspace next
bindgesture swipe:4:right workspace prev
for_window [shell="xwayland"] title_format "[XWayland] %title"
#set $lock_cmd \
# loginctl list-sessions | \
# tail '-n' +2 | head -n -2 | awk '{print $1}' | \
# xargs loginctl lock-session
set $lock_cmd \
loginctl list-sessions | \
tail '-n' +2 | head -n -2 | awk '{print $1}' | \
xargs loginctl lock-session
zsh -c "commands loginctl --action lock-session"
bindgesture swipe:4:up exec $lock_cmd
@ -82,35 +79,48 @@ bindgesture swipe:4:up exec $lock_cmd
#
# Basics:
#
bindsym $mod+Shift+l exec $lock_cmd
bindsym Shift+$mod+l exec $lock_cmd
bindsym XF86KbdBrightnessDown \
exec commands \
bindsym --locked Shift+mod1+1 \
exec ~/.local/bin/commands \
desktop-services \
--backlight-decrease \
--backlight-type keyboard
--cpufreq-action performance
bindsym XF86KbdBrightnessUp \
exec commands \
bindsym --locked Shift+mod1+2 \
exec ~/.local/bin/commands \
desktop-services \
--backlight-increase \
--backlight-type keyboard
--cpufreq-action powersave
bindsym XF86MonBrightnessDown \
exec commands \
bindsym --locked XF86MonBrightnessDown \
exec ~/.local/bin/commands \
desktop-services \
--backlight-decrease \
--backlight-type output
bindsym XF86MonBrightnessUp \
exec commands \
bindsym --locked XF86MonBrightnessUp \
exec ~/.local/bin/commands \
desktop-services \
--backlight-increase \
--backlight-type output
bindsym XF86AudioPlay exec bash -c "commands media-play-pause"
bindsym XF86AudioNext exec bash -c "commands media-next"
bindsym XF86AudioPrev exec bash -c "commands media-prev"
bindsym --locked XF86KbdBrightnessDown \
exec ~/.local/bin/commands \
desktop-services \
--backlight-decrease \
--backlight-type keyboard
bindsym --locked XF86KbdBrightnessUp \
exec ~/.local/bin/commands \
desktop-services \
--backlight-increase \
--backlight-type keyboard
bindsym --locked XF86AudioPlay exec zsh -c "commands media-play-pause"
bindsym --locked XF86AudioRaiseVolume exec zsh -c "commands media-raise-volume"
bindsym --locked XF86AudioLowerVolume exec zsh -c "commands media-lower-volume"
bindsym --locked XF86AudioMute exec zsh -c "commands media-toggle-volume"
bindsym --locked XF86AudioNext exec zsh -c "commands media-next"
bindsym --locked XF86AudioPrev exec zsh -c "commands media-prev"
# Start a terminal
@ -133,11 +143,19 @@ floating_modifier $mod normal
bindsym $mod+Shift+c reload
# Exit sway (logs you out of your Wayland session)
bindsym $mod+Shift+e exec swaynag -t warning -m 'You pressed the exit shortcut. Do you really want to exit sway? This will end your Wayland session.' -b 'Yes, exit sway' 'swaymsg exit'
bindsym $mod+Shift+e \
exec swaynag -t warning \
-m 'You pressed the exit shortcut. Do you really want to exit sway? This will end your Wayland session.' \
-b 'Yes, exit sway' \
'swaymsg exit'
#
# Moving around:
#
# Move your focus around
bindsym Shift+mod1+tab focus prev
bindsym mod1+tab focus next
#bindsym mod1+tab focus mode_toggle
bindsym $mod+$left focus left
bindsym $mod+$down focus down
bindsym $mod+$up focus up
@ -201,6 +219,7 @@ bindsym $mod+v splitv
#bindsym $mod+s layout stacking
#bindsym $mod+w layout tabbed
#bindsym $mod+e layout toggle split
bindsym $mod+e layout toggle all
# Make the current focus fullscreen
bindsym $mod+f fullscreen
@ -212,7 +231,8 @@ bindsym $mod+p floating toggle
## Swap focus between the tiling area and the floating area
#bindsym $mod+space focus mode_toggle
bindsym --release Print exec bash -c "commands wl-screenshot"
bindsym --release Print exec zsh -c "commands wl-screenshot"
bindsym --release $mod+s exec zsh -c "commands wl-screenshot"
# Move focus to the parent container
#bindsym $mod+a focus parent
@ -253,6 +273,40 @@ mode "resize" {
}
bindsym $mod+r mode "resize"
set $black #000000
set $red #ff0000
set $green #00ff00
set $blue #0000ff
set $white #ffffff
set $grey #757575
set $pale_green #9df882
set $pale_green2 #6baf54
set $dark_green #1a7000
set $pale_blue #7da9f9
set $dark_blue #005ba6
set $pale_greenblue #2da078
set $pale_greenblue2 #66c473
set $yellow #fffd0d
set $dark_yellow #908f00
set $color1 #18ff00
set $color2 #000000
set $color3 #ff00ff
set $color4 #ff0000
set $color5 #00000000
set $color6 #00000000
set $color7 #00000000
set $border_focused $pale_green
set $border_unfocused $color2
set $background_focused $pale_greenblue2
set $background_unfocused $grey
set $child_border_focused $white
set $child_border_unfocused $color2
set $bright_text $white
set $dark_text $black
#
# Status Bar:
#
@ -263,7 +317,7 @@ bar {
# When the status_command prints a new line to stdout, swaybar updates.
# The default just shows the current date and time.
status_command while true; \
do commands status --config ~/.config/commands-status.json; \
do ~/.local/bin/commands status --config ~/.config/commands-status.json; \
sleep 1; \
done
@ -272,27 +326,34 @@ bar {
height 16
colors {
statusline #565656
background #dfdfdf
inactive_workspace #dfdfdf #dfdfdf #000000
active_workspace #dfdfdf #efefef #000000
focused_workspace #dfdfdf #efefef #000000
statusline $bright_text
background $pale_green2
inactive_workspace $black $white $dark_text
active_workspace $black $white $bright_text
focused_workspace $dark_yellow $yellow $dark_text
}
}
client.focused #f3f3f3 #dfdfdf #565656 #f3f3f3 #f3f3f3
client.unfocused #f3f3f3 #dfdfdf #565656 #f3f3f3 #f3f3f3
#client.focused #f3f3f3 #dfdfdfdd #565656 #f3f3f3 #f3f3f3
client.focused $border_focused $background_focused $white $white $child_border_focused
client.unfocused $border_unfocused $background_unfocused $white $white $child_border_unfocused
for_window [all] border 1
#font pango:Helvetica Neue 10
font pango:Terminus 10
font pango:Terminus 12
titlebar_padding 1 4
titlebar_padding 32 1
titlebar_border_thickness 1
title_align center
#for_window [class=".*"] title_format "<b>%title</b>"
for_window [class="^firefox$"] floating enable
for_window [all] opacity set 0.95
input * {
xkb_layout "us,ru"
xkb_options "grp:win_space_toggle"
@ -300,3 +361,4 @@ input * {
input type:keyboard xkb_model "pc101"
include /etc/sway/config.d/*
include ~/.sway/config.d/*

@ -0,0 +1,20 @@
### Output configuration
#
# Default wallpaper (more resolutions are available in /usr/share/backgrounds/sway/)
#output * bg /usr/share/backgrounds/sway/Sway_Wallpaper_Blue_1920x1080.png fill
#
# Example configuration:
#
# output HDMI-A-1 resolution 1920x1080 position 1920,0
#
# You can get the names of your outputs by running: swaymsg -t get_outputs
#2560 x 1440
output 'Dell Inc. DELL P2418D MY3ND8220WKT' resolution 1920x1080 position 0,0
#output 'Dell Inc. DELL P2418D MY3ND8220WKT' mode resolution 2560x1440 position 0,0
#output HDMI-A-1 resolution 1920x1080 transform 90 position 0,0
output 'LG Electronics LG FHD 403TOAG3C208 ' resolution 1920x1080 transform 90 position 0,0
#output eDP-1 resolution 1366x748 position 277,1080
#output eDP-1 resolution 1366x748 disable power off position 277,1080
output 'Apple Computer Inc Color LCD Unknown' \
resolution 1366x748 enable power on position 277,1080
bindsym --locked $mod+u output 'Apple Computer Inc Color LCD Unknown' toggle

@ -31,19 +31,33 @@ hi MatchParen guifg=white guibg=black gui=NONE ctermfg=1 ctermbg=0
function! MakeSession()
let b:sessiondir = '.vim/'
if exists('g:session_name')
let b:session_name = g:session_name
else
let b:session_name = 'session'
endif
if (filewritable(b:sessiondir) != 2)
exe 'silent !mkdir -p ' b:sessiondir
redraw!
endif
let b:filename = b:sessiondir . '/session.vim'
let b:filename = b:sessiondir . '/' . b:session_name . '.vim'
exe "mksession! " . b:filename
echo 'saved ' . b:session_name
endfunction
function! LoadSession()
let b:sessiondir = '.vim/'
let b:sessionfile = b:sessiondir . "/session.vim"
if (filereadable(b:sessionfile))
exe 'source ' b:sessionfile
if exists('g:session_name')
let b:session_name = g:session_name
else
let b:session_name = 'session'
endif
let b:filename = b:sessiondir . '/' . b:session_name . '.vim'
if (filereadable(b:filename))
exe 'source ' b:filename
" echo 'loaded ' . b:session_name
else
echo "No session loaded."
endif
@ -56,6 +70,7 @@ map <Leader>z :wqa<CR>
map <Leader>m :py3 f1()<CR>
map <Leader>r :redraw!<CR>
map <Leader>s :call MakeSession()<CR>
map <Leader>% :let g:session_name = 'session'
map <Leader>l :call LoadSession()
map <Leader>cq :cq<CR>
map <Leader>f2 :py3 f2()<CR>

@ -1,2 +1,6 @@
ACTION=="add|change", SUBSYSTEM=="leds", DEVPATH=="/devices/pci0000:00/0000:00:1b.0/hdaudioC0D0/leds/hda::mute", RUN{program}+="/usr/bin/chmod 666 /sys$devpath/brightness"
ACTION=="add|change", SUBSYSTEM=="leds", DEVPATH=="/devices/platform/applesmc.768/leds/smc::kbd_backlight", RUN{program}+="/usr/bin/chmod 666 /sys$devpath/brightness"
ACTION=="add|change", DEVPATH=="/devices/platform/applesmc.768", RUN{program}+="/usr/bin/chmod 666 /sys$devpath/fan1_manual /sys$devpath/fan1_output"
ACTION=="add|change", DEVPATH=="/class/backlight/intel_backlight", RUN{program}+="/usr/bin/chmod 666 /sys$devpath/brightness"
ACTION=="add|change", DEVPATH=="/devices/system/cpu/", RUN{program}+="/usr/bin/chmod 666 /sys$devpath/cpufreq/scaling_governor"
ACTION=="add|change", KERNEL=="cpu[0-9]", SUBSYSTEM=="cpu", RUN{program}+="/usr/bin/chmod 666 /sys$devpath/cpufreq/scaling_governor"

295
m.py Executable file

@ -0,0 +1,295 @@
#!/usr/bin/env python3
import glob
import io
import tempfile
import dataclasses
import pathlib
import sys
import subprocess
import os
import logging
import tomllib
from typing import (Self, Optional, Any,)
logger = logging.getLogger(__name__)
@dataclasses.dataclass
class PyProject:
path: pathlib.Path
dependencies: dict[str, list[str]]
early_features: Optional[list[str]] = None
pip_find_links: Optional[list[pathlib.Path]] = None
runtime_libdirs: Optional[list[pathlib.Path]] = None
runtime_preload: Optional[list[pathlib.Path]] = None
requirements: dict[str, pathlib.Path] = dataclasses.field(default_factory=lambda : dict())
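# Read pyproject.toml: collect [project] dependencies plus optional-dependencies,
# and the extra settings from the [tool."online-fxreader-pr34"] table
# (early_features, pip_find_links, runtime_libdirs, runtime_preload, requirements).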
def pyproject_load(
d: pathlib.Path,
) -> PyProject:
with io.open(d, 'rb') as f:
content = tomllib.load(f)
assert isinstance(content, dict)
dependencies : dict[str, list[str]] = dict()
dependencies['default'] = content['project']['dependencies']
if (
'optional-dependencies' in content['project']
):
assert isinstance(
content['project']['optional-dependencies'],
dict
)
for k, v in content['project']['optional-dependencies'].items():
assert isinstance(v, list)
assert isinstance(k, str)
dependencies[k] = v
res = PyProject(
path=d,
dependencies=dependencies,
)
tool_name = 'online.fxreader.pr34'.replace('.', '-')
if (
'tool' in content and
isinstance(
content['tool'], dict
) and
tool_name in content['tool'] and
isinstance(
content['tool'][tool_name],
dict
)
):
if 'early_features' in content['tool'][tool_name]:
res.early_features = content['tool'][tool_name]['early_features']
if 'pip_find_links' in content['tool'][tool_name]:
res.pip_find_links = [
d.parent / pathlib.Path(o)
for o in content['tool'][tool_name]['pip_find_links']
]
if 'runtime_libdirs' in content['tool'][tool_name]:
res.runtime_libdirs = [
d.parent / pathlib.Path(o)
# pathlib.Path(o)
for o in content['tool'][tool_name]['runtime_libdirs']
]
if 'runtime_preload' in content['tool'][tool_name]:
res.runtime_preload = [
d.parent / pathlib.Path(o)
# pathlib.Path(o)
for o in content['tool'][tool_name]['runtime_preload']
]
if 'requirements' in content['tool'][tool_name]:
assert isinstance(content['tool'][tool_name]['requirements'], dict)
res.requirements = {
k : d.parent / pathlib.Path(v)
# pathlib.Path(o)
for k, v in content['tool'][tool_name]['requirements'].items()
}
return res
@dataclasses.dataclass
class BootstrapSettings:
env_path: pathlib.Path
python_path: pathlib.Path
base_dir: pathlib.Path
uv_args: list[str] = dataclasses.field(
default_factory=lambda : os.environ.get(
'UV_ARGS',
'--offline',
).split(),
)
@classmethod
def get(
cls,
base_dir: Optional[pathlib.Path] = None,
) -> Self:
if base_dir is None:
base_dir = pathlib.Path.cwd()
env_path = base_dir / '.venv'
python_path = env_path / 'bin' / 'python3'
return cls(
base_dir=base_dir,
env_path=env_path,
python_path=python_path,
)
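# Create the virtualenv: compile a hash-pinned requirements file with 'uv pip compile'
# (unless one is already listed for the selected early features), then 'uv venv' and
# 'uv pip install --require-hashes' into it.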
def env_bootstrap(
bootstrap_settings: BootstrapSettings,
pyproject: PyProject,
) -> None:
pip_find_links : list[pathlib.Path] = []
if not pyproject.pip_find_links is None:
pip_find_links.extend(pyproject.pip_find_links)
pip_find_links_args = sum([
['-f', str(o),]
for o in pip_find_links
], [])
features : list[str] = []
if pyproject.early_features:
features.extend(pyproject.early_features)
requirements_name = '_'.join(sorted(features))
requirements_path : Optional[pathlib.Path] = None
if requirements_name in pyproject.requirements:
requirements_path = pyproject.requirements[requirements_name]
else:
requirements_path = pyproject.path.parent / 'requirements.txt'
requirements_in : list[str] = []
requirements_in.extend([
'uv', 'pip', 'build', 'setuptools', 'meson-python', 'pybind11'
])
if pyproject.early_features:
early_dependencies = sum([
pyproject.dependencies[o]
for o in pyproject.early_features
], [])
logger.info(dict(
early_dependencies=early_dependencies,
))
requirements_in.extend(early_dependencies)
# if len(early_dependencies) > 0:
# subprocess.check_call([
# bootstrap_settings.python_path,
# '-m',
# 'uv', 'pip', 'install',
# *pip_find_links_args,
# # '-f', str(pathlib.Path(__file__).parent / 'deps' / 'dist'),
# *bootstrap_settings.uv_args,
# *early_dependencies,
# ])
if not requirements_path.exists():
with tempfile.NamedTemporaryFile(
mode='w',
prefix='requirements',
suffix='.in',
) as f:
f.write(
'\n'.join(requirements_in)
)
f.flush()
subprocess.check_call([
'uv',
'pip',
'compile',
'--generate-hashes',
*pip_find_links_args,
# '-p',
# bootstrap_settings.python_path,
*bootstrap_settings.uv_args,
'-o', str(requirements_path),
f.name,
])
subprocess.check_call([
'uv', 'venv',
*pip_find_links_args,
# '--seed',
*bootstrap_settings.uv_args,
str(bootstrap_settings.env_path)
])
subprocess.check_call([
'uv',
'pip',
'install',
*pip_find_links_args,
'-p',
bootstrap_settings.python_path,
'--require-hashes',
*bootstrap_settings.uv_args,
'-r', str(requirements_path),
])
def paths_equal(
a: pathlib.Path | str,
b: pathlib.Path | str
) -> bool:
return (
os.path.abspath(str(a)) ==
os.path.abspath(str(b))
)
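# Bootstrap .venv if it does not exist yet, then re-exec: first into the venv's python3
# (if we are not already running it), and finally into cli.py with the original arguments.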
def run(
d: Optional[pathlib.Path] = None,
cli_path: Optional[pathlib.Path] = None,
) -> None:
if cli_path is None:
cli_path = pathlib.Path(__file__).parent / 'cli.py'
if d is None:
d = pathlib.Path(__file__).parent / 'pyproject.toml'
bootstrap_settings = BootstrapSettings.get()
pyproject : PyProject = pyproject_load(
d
)
logging.basicConfig(level=logging.INFO)
if not bootstrap_settings.env_path.exists():
env_bootstrap(
bootstrap_settings=bootstrap_settings,
pyproject=pyproject,
)
logger.info([sys.executable, sys.argv, bootstrap_settings.python_path])
if not paths_equal(sys.executable, bootstrap_settings.python_path):
os.execv(
str(bootstrap_settings.python_path),
[
str(bootstrap_settings.python_path),
*sys.argv,
]
)
os.execv(
str(bootstrap_settings.python_path),
[
str(bootstrap_settings.python_path),
str(
cli_path
),
*sys.argv[1:],
]
)
if __name__ == '__main__':
run(
d=pathlib.Path(__file__).parent / 'python' / 'pyproject.toml',
cli_path=pathlib.Path(__file__).parent / 'python' / 'cli.py',
)

@ -0,0 +1,6 @@
import distutils.command
from typing import (Any,)
def _get_build_extension() -> distutils.command.build_ext: ...
def load_dynamic(name: str, path: str) -> Any: ...

@ -0,0 +1,17 @@
import setuptools.extension
from typing import (Iterable,)
def cythonize(
module_list: str | Iterable[str]
#module_list,
#exclude=None,
#nthreads=0,
#aliases=None,
#quiet=False,
#force=None,
#language=None,
#exclude_failures=False,
#show_all_warnings=False,
#**options
) -> list[setuptools.extension.Extension]: ...

@ -0,0 +1,7 @@
from typing import (Type, Any, Self)
class NoGIL:
def __enter__(self) -> Self: ...
def __exit__(self, exc_class: Type[Exception], exc: Exception, tb: Any) -> None: ...
nogil : NoGIL = NoGIL()

@ -0,0 +1,14 @@
import setuptools.extension
import pathlib
class build_ext:
extensions : list[setuptools.extension.Extension]
#build_temp : pathlib.Path
#build_lib: pathlib.Path
build_temp: str
build_lib: str
def run(self) -> None:
...
...

@ -0,0 +1,8 @@
from typing import (Iterable,)
class Trie:
def __init__(self, entries: Iterable[str]) -> None: ...
def keys(self, entry: str) -> list[str]: ...
def __contains__(self, entry: str) -> bool: ...

@ -0,0 +1,18 @@
import setuptools.extension
from typing import (Any, Iterable,)
def mypycify(
paths: 'list[str]',
*,
only_compile_paths: 'Iterable[str] | None' = None,
verbose: 'bool' = False,
opt_level: 'str' = '3',
debug_level: 'str' = '1',
strip_asserts: 'bool' = False,
multi_file: 'bool' = False,
separate: 'bool | list[tuple[list[str], str | None]]' = False,
skip_cgen_input: 'Any | None' = None,
target_dir: 'str | None' = None,
include_runtime_files: 'bool | None' = None,
) -> 'list[setuptools.extension.Extension]': ...

@ -0,0 +1,8 @@
from typing import (Self, Any)
class tqdm:
def __enter__(self) -> Self: ...
def __exit__(self, args: Any) -> None: ...
def update(self, delta: int) -> None: ...
def set_description(self, description: str) -> None: ...

@ -0,0 +1,7 @@
def listen(
addr: tuple[str, int],
) -> None: ...
def wait_for_client() -> None: ...
def breakpoint() -> None: ...

@ -0,0 +1,6 @@
[Unit]
Description=udev scripts
[Service]
Type=simple
ExecStart=/usr/local/bin/online-fxreader-pr34-udev --device=%I

@ -0,0 +1,11 @@
ACTION=="add|change", SUBSYSTEM=="leds", DEVPATH=="/devices/pci0000:00/0000:00:1b.0/hdaudioC0D0/leds/hda::mute", RUN{program}+="/usr/bin/chmod 666 /sys$devpath/brightness"
ACTION=="add|change", SUBSYSTEM=="leds", DEVPATH=="/devices/platform/applesmc.768/leds/smc::kbd_backlight", RUN{program}+="/usr/bin/chmod 666 /sys$devpath/brightness"
# udevadm info --attribute-walk --path=/sys/devices/platform/applesmc.768/
# udevadm trigger --action=add --verbose --parent-match /devices/platform/applesmc.768/
#ACTION=="add|change", KERNEL=="applesmc.768", SUBSYSTEM=="platform", DRIVER=="applesmc", RUN{program}+="ls -allh /sys$devpath/", OPTIONS="log_level=debug"
#ACTION=="add|change", KERNEL=="applesmc.768", SUBSYSTEM=="platform", DRIVER=="applesmc", RUN{program}+="/usr/bin/ls -allh /sys$devpath/", OPTIONS="log_level=debug"
ACTION=="add|change", KERNEL=="applesmc.768", SUBSYSTEM=="platform", DRIVER=="applesmc", TAG+="systemd", ENV{SYSTEMD_WANTS}="online.fxreader.pr34.udev@$devnode.service", OPTIONS="log_level=debug"
#KERNEL=="applesmc.768", SUBSYSTEM=="platform", DRIVER=="applesmc", MODE="0660", TAG+="uaccess", OPTIONS="log_level=debug", OPTIONS+="watch"
ACTION=="add|change", DEVPATH=="/class/backlight/intel_backlight", RUN{program}+="/usr/bin/chmod 666 /sys$devpath/brightness"
ACTION=="add|change", KERNEL=="cpu0", SUBSYSTEM=="cpu", TAG+="systemd", ENV{SYSTEMD_WANTS}="online.fxreader.pr34.udev@$devnode.service", OPTIONS="log_level=debug"
ACTION=="add|change", KERNEL=="cpu[0-9]", SUBSYSTEM=="cpu", TAG+="systemd", ENV{SYSTEMD_WANTS}="online.fxreader.pr34.udev@$devnode.service", OPTIONS="log_level=debug"

@ -0,0 +1,103 @@
#!/usr/bin/python3
# vi: filetype=python
import re
import sys
import os
import subprocess
import argparse
import logging
import time
from typing import (Any,)
logger = logging.getLogger(__name__)
def run() -> None:
logging.basicConfig(level=logging.INFO)
parser = argparse.ArgumentParser()
parser.add_argument(
'--device',
)
options = parser.parse_args()
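# Each entry maps a device path pattern (regex) to the sysfs node to wait for and the
# shell snippet that relaxes its permissions; only matching devices are processed.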
DEVICES : dict[str, Any] = dict(
applesmc=dict(
devpath='sys/devices/platform/applesmc.768',
node='/sys/devices/platform/applesmc.768/fan1_manual',
cmd=r'''
chown root:fan /sys/devices/platform/applesmc.768/fan1_*
chmod g+w /sys/devices/platform/applesmc.768/fan1_*
''',
),
intel_pstate=dict(
devpath=r'/?sys/devices/system/cpu/cpu0',
node='/sys/devices/system/cpu/intel_pstate/no_turbo',
cmd=r'''
chown root:fan /sys/devices/system/cpu/intel_pstate/no_turbo
chown root:fan /sys/devices/system/cpu/intel_pstate/max_perf_pct
#chown root:fan /sys/devices/system/cpu/intel_pstate/status
chmod g+w /sys/devices/system/cpu/intel_pstate/no_turbo
chmod g+w /sys/devices/system/cpu/intel_pstate/max_perf_pct
#chmod g+w /sys/devices/system/cpu/intel_pstate/status
echo passive > /sys/devices/system/cpu/intel_pstate/status
chown root:fan /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor
chown root:fan /sys/devices/system/cpu/cpu*/cpufreq/scaling_max_freq
chmod g+w /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor
chmod g+w /sys/devices/system/cpu/cpu*/cpufreq/scaling_max_freq
''',
),
#governor=dict(
# devpath=r'/?sys/devices/system/cpu/cpu(\d+)',
# node=r'/sys/devices/system/cpu/cpu{0}/cpufreq/scaling_governor',
# cmd=r'''
# chown root:fan /sys/devices/system/cpu/cpu{0}/cpufreq/scaling_governor
# chown root:fan /sys/devices/system/cpu/cpu{0}/cpufreq/scaling_max_freq
# chmod g+w /sys/devices/system/cpu/cpu{0}/cpufreq/scaling_governor
# chmod g+w /sys/devices/system/cpu/cpu{0}/cpufreq/scaling_max_freq
# ''',
#),
)
processed : int = 0
logger.info(dict(device=options.device))
for k, v in DEVICES.items():
devpath = re.compile(v['devpath'])
devpath_m = devpath.match(options.device)
if devpath_m is None:
continue
node_2 = v['node'].format(*devpath_m.groups())
# logger.info(dict(devpath_m=devpath_m, node=node_2))
while not os.path.exists(node_2):
#continue
time.sleep(1)
cmd_2 = v['cmd'].format(*devpath_m.groups())
subprocess.check_call(cmd_2, shell=True)
logger.info(dict(
devpath_m=devpath_m,
node_2=node_2,
cmd_2=cmd_2,
msg='processed',
label=k,
))
processed += 1
if processed == 0:
raise NotImplementedError
if __name__ == '__main__':
run()

@ -0,0 +1,6 @@
[Unit]
Description=udev scripts
[Service]
Type=simple
ExecStart=/usr/local/bin/online-fxreader-pr34-udev --device=%I

@ -0,0 +1,26 @@
[Unit]
Description=Disable and Re-Enable Apple BCE Module (and Wi-Fi)
Before=sleep.target
Before=hibernate.target
StopWhenUnneeded=yes
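# Started when sleep/hibernate targets come up and stopped again when they go down,
# so ExecStart unloads the modules before suspend and ExecStop restores them on resume.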
[Service]
User=root
Type=oneshot
RemainAfterExit=yes
ExecStart=/usr/local/bin/online-fxreader-pr34-suspend-fix-t2 disable_apple_bce
#ExecStart=/usr/bin/modprobe -r apple_bce
#ExecStart=/usr/bin/modprobe -r brcmfmac_wcc
#ExecStart=/usr/bin/modprobe -r brcmfmac
#ExecStart=/usr/bin/rmmod -f apple-bce
ExecStop=/usr/local/bin/online-fxreader-pr34-suspend-fix-t2 enable_apple_bce
#ExecStop=/usr/bin/modprobe -r apple_bce
#ExecStop=/usr/bin/modprobe apple-bce
#ExecStop=/usr/bin/modprobe brcmfmac
#ExecStop=/usr/bin/modprobe brcmfmac_wcc
[Install]
WantedBy=sleep.target
WantedBy=hibernate.target

@ -0,0 +1,13 @@
ACTION=="add|change", SUBSYSTEM=="leds", DEVPATH=="/devices/pci0000:00/0000:00:1b.0/hdaudioC0D0/leds/hda::mute", RUN{program}+="/usr/bin/chmod 666 /sys$devpath/brightness"
ACTION=="add|change", SUBSYSTEM=="leds", DEVPATH=="/devices/platform/applesmc.768/leds/smc::kbd_backlight", RUN{program}+="/usr/bin/chmod 666 /sys$devpath/brightness"
# udevadm info --attribute-walk --path=/sys/devices/platform/applesmc.768/
# udevadm trigger --action=add --verbose --parent-match /devices/platform/applesmc.768/
#ACTION=="add|change", KERNEL=="applesmc.768", SUBSYSTEM=="platform", DRIVER=="applesmc", RUN{program}+="ls -allh /sys$devpath/", OPTIONS="log_level=debug"
#ACTION=="add|change", KERNEL=="applesmc.768", SUBSYSTEM=="platform", DRIVER=="applesmc", RUN{program}+="/usr/bin/ls -allh /sys$devpath/", OPTIONS="log_level=debug"
#ACTION=="add|change", KERNEL=="applesmc.768", SUBSYSTEM=="platform", DRIVER=="applesmc", TAG+="systemd", ENV{SYSTEMD_WANTS}="online.fxreader.pr34.udev@$devnode.service", OPTIONS="log_level=debug"
ACTION=="add|change", KERNEL=="cpu0", SUBSYSTEM=="cpu", TAG+="systemd", ENV{SYSTEMD_WANTS}="online.fxreader.pr34.udev@$devnode.service", OPTIONS="log_level=debug"
#KERNEL=="applesmc.768", SUBSYSTEM=="platform", DRIVER=="applesmc", MODE="0660", TAG+="uaccess", OPTIONS="log_level=debug", OPTIONS+="watch"
ACTION=="add|change", DEVPATH=="/class/backlight/intel_backlight", RUN{program}+="/usr/bin/chmod 666 /sys$devpath/brightness"
#ACTION=="add|change", DEVPATH=="/devices/system/cpu/", RUN{program}+="/usr/bin/chmod 666 /sys$devpath/cpufreq/scaling_governor"
ACTION=="add|change", KERNEL=="cpu[0-9]", SUBSYSTEM=="cpu", TAG+="systemd", ENV{SYSTEMD_WANTS}="online.fxreader.pr34.udev@$devnode.service", OPTIONS="log_level=debug"

@ -0,0 +1,113 @@
#!/usr/bin/env python3
# vi: set filetype=python
import sys
import time
import argparse
import subprocess
parser = argparse.ArgumentParser()
parser.add_argument('mode', choices=[
'disable_apple_bce',
'enable_apple_bce',
])
options = parser.parse_args()
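# systemd runs this script with 'disable_apple_bce' before suspend and 'enable_apple_bce'
# on resume; each step retries every second until the module/service operation succeeds.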
if options.mode == 'disable_apple_bce':
while True:
ret = subprocess.call([
'systemctl', 'stop', 'iwd',
])
if ret != 0:
time.sleep(1)
else:
break
while True:
ret = subprocess.call([
'modprobe', '-r', 'brcmfmac_wcc',
])
if ret != 0:
time.sleep(1)
else:
break
while True:
ret = subprocess.call([
'modprobe', '-r', 'brcmfmac',
])
if ret != 0:
time.sleep(1)
else:
break
while True:
ret = subprocess.call([
'modprobe', '-r', 'applesmc',
])
if ret != 0:
time.sleep(1)
else:
break
while True:
ret = subprocess.call([
'rmmod', '-f', 'apple-bce',
])
#if ret != 0:
# time.sleep(1)
#else:
# break
break
elif options.mode == 'enable_apple_bce':
while True:
ret = subprocess.call([
'modprobe', 'applesmc',
])
if ret != 0:
time.sleep(1)
else:
break
while True:
ret = subprocess.call([
'modprobe', 'apple-bce',
])
if ret != 0:
time.sleep(1)
else:
break
while True:
ret = subprocess.call([
'modprobe', 'brcmfmac',
])
if ret != 0:
time.sleep(1)
else:
break
while True:
ret = subprocess.call([
'modprobe', 'brcmfmac_wcc',
])
if ret != 0:
time.sleep(1)
else:
break
while True:
ret = subprocess.call([
'systemctl', 'start', 'iwd',
])
if ret != 0:
time.sleep(1)
else:
break
else:
raise NotImplementedError

@ -0,0 +1,103 @@
#!/usr/bin/python3
# vi: filetype=python
import re
import sys
import os
import subprocess
import argparse
import logging
import time
from typing import (Any,)
logger = logging.getLogger(__name__)
def run() -> None:
logging.basicConfig(level=logging.INFO)
parser = argparse.ArgumentParser()
parser.add_argument(
'--device',
)
options = parser.parse_args()
DEVICES : dict[str, Any] = dict(
applesmc=dict(
devpath='sys/devices/platform/applesmc.768',
node='/sys/devices/platform/applesmc.768/fan1_manual',
cmd=r'''
chown root:fan /sys/devices/platform/applesmc.768/fan1_*
chmod g+w /sys/devices/platform/applesmc.768/fan1_*
''',
),
intel_pstate=dict(
devpath=r'/?sys/devices/system/cpu/cpu0',
node='/sys/devices/system/cpu/intel_pstate/no_turbo',
cmd=r'''
chown root:fan /sys/devices/system/cpu/intel_pstate/no_turbo
chown root:fan /sys/devices/system/cpu/intel_pstate/max_perf_pct
#chown root:fan /sys/devices/system/cpu/intel_pstate/status
chmod g+w /sys/devices/system/cpu/intel_pstate/no_turbo
chmod g+w /sys/devices/system/cpu/intel_pstate/max_perf_pct
#chmod g+w /sys/devices/system/cpu/intel_pstate/status
echo passive > /sys/devices/system/cpu/intel_pstate/status
chown root:fan /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor
chown root:fan /sys/devices/system/cpu/cpu*/cpufreq/scaling_max_freq
chmod g+w /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor
chmod g+w /sys/devices/system/cpu/cpu*/cpufreq/scaling_max_freq
''',
),
#governor=dict(
# devpath=r'/?sys/devices/system/cpu/cpu(\d+)',
# node=r'/sys/devices/system/cpu/cpu{0}/cpufreq/scaling_governor',
# cmd=r'''
# chown root:fan /sys/devices/system/cpu/cpu{0}/cpufreq/scaling_governor
# chown root:fan /sys/devices/system/cpu/cpu{0}/cpufreq/scaling_max_freq
# chmod g+w /sys/devices/system/cpu/cpu{0}/cpufreq/scaling_governor
# chmod g+w /sys/devices/system/cpu/cpu{0}/cpufreq/scaling_max_freq
# ''',
#),
)
processed : int = 0
logger.info(dict(device=options.device))
for k, v in DEVICES.items():
devpath = re.compile(v['devpath'])
devpath_m = devpath.match(options.device)
if devpath_m is None:
continue
node_2 = v['node'].format(*devpath_m.groups())
# logger.info(dict(devpath_m=devpath_m, node=node_2))
while not os.path.exists(node_2):
#continue
time.sleep(1)
cmd_2 = v['cmd'].format(*devpath_m.groups())
subprocess.check_call(cmd_2, shell=True)
logger.info(dict(
devpath_m=devpath_m,
node_2=node_2,
cmd_2=cmd_2,
msg='processed',
label=k,
))
processed += 1
if processed == 0:
raise NotImplementedError
if __name__ == '__main__':
run()

254
python/_m.py Normal file

@ -0,0 +1,254 @@
#!/usr/bin/env python3
#vim: set filetype=python
import logging
import json
import enum
import pathlib
import sys
import argparse
#import optparse
import dataclasses
import subprocess
import os
from typing import (
Optional, Any, TypeAlias, Literal, cast, BinaryIO, Generator,
ClassVar, Self,
)
logger = logging.getLogger()
@dataclasses.dataclass
class Settings:
project_root : pathlib.Path = pathlib.Path.cwd()
env_path : pathlib.Path = project_root / 'tmp' / 'env3'
_settings : ClassVar[Optional['Settings']] = None
@classmethod
def settings(cls) -> Self:
if cls._settings is None:
cls._settings = cls()
return cls._settings
def js(argv: list[str]) -> int:
return subprocess.check_call([
'sudo',
'docker-compose',
'--project-directory',
Settings.settings().project_root,
'-f',
Settings.settings().project_root / 'docker' / 'js' / 'docker-compose.yml',
*argv,
])
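# Create ./tmp/env3 on first use (a venv with system site-packages plus
# requirements.txt) and either exec() into its python3 or run it as a
# subprocess, depending on `mode`.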
def env(
argv: Optional[list[str]] = None,
mode: Literal['exec', 'subprocess'] = 'subprocess',
**kwargs: Any,
) -> Optional[subprocess.CompletedProcess[bytes]]:
env_path = Settings.settings().env_path
if not env_path.exists():
subprocess.check_call([
sys.executable, '-m', 'venv',
'--system-site-packages',
str(env_path)
])
subprocess.check_call([
env_path / 'bin' / 'python3',
'-m', 'pip',
'install', '-r', 'requirements.txt',
])
if not argv is None:
python_path = str(env_path / 'bin' / 'python3')
if mode == 'exec':
os.execv(
python_path,
[
python_path,
*argv,
],
)
return None
elif mode == 'subprocess':
return subprocess.run([
python_path,
*argv,
], **kwargs)
else:
raise NotImplementedError
return None
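# Run `ruff check` through the project venv, then group the JSON diagnostics
# by file and log a per-file error-count histogram.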
def ruff(argv: list[str]) -> None:
parser = argparse.ArgumentParser()
parser.add_argument(
'-i',
dest='paths',
help='specify paths to check',
default=[],
action='append',
)
parser.add_argument(
'-e',
dest='exclude',
help='rules to ignore',
default=[],
action='append',
)
options, args = parser.parse_known_args(argv)
if len(options.paths) == 0:
options.paths.extend([
'.',
'dotfiles/.local/bin/commands',
])
if len(options.exclude) == 0:
options.exclude.extend([
'E731',
'E713',
'E714',
'E703',
])
res = env([
'-m',
'ruff',
'check',
*args,
'--output-format', 'json',
'--ignore', ','.join(options.exclude),
*options.paths,
], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
assert not res is None
errors = json.loads(res.stdout.decode('utf-8'))
g: dict[str, Any] = dict()
for o in errors:
if not o['filename'] in g:
g[o['filename']] = []
g[o['filename']].append(o)
h = {
k : len(v)
for k, v in g.items()
}
logger.info(json.dumps(errors, indent=4))
logger.info(json.dumps(h, indent=4))
def inside_env() -> bool:
try:
import numpy
return True
except Exception:
return False
#class Commands(enum.StrEnum):
# js = 'js'
# mypy = 'mypy'
# env = 'env'
# ruff = 'ruff'
# m2 = 'm2'
# def mypy(argv: list[str]) -> None:
# import online.fxreader.pr34.commands_typed.mypy as _mypy
# _mypy.run(
# argv,
# )
def host_deps(argv: list[str]) -> None:
if sys.platform in ['linux']:
subprocess.check_call(r'''
exec yay -S $(cat requirements-archlinux.txt)
''', shell=True,)
else:
raise NotImplementedError
Command_args = ['js', 'mypy', 'env', 'ruff', 'm2', 'host_deps',]
Command : TypeAlias = Literal['js', 'mypy', 'env', 'ruff', 'm2', 'host_deps',]
def run(argv: Optional[list[str]] = None) -> None:
logging.basicConfig(
level=logging.INFO,
format=(
'%(levelname)s:%(name)s:%(message)s'
':%(process)d'
':%(asctime)s'
':%(pathname)s:%(funcName)s:%(lineno)s'
),
)
if argv is None:
argv = sys.argv[:]
parser = argparse.ArgumentParser()
parser.add_argument(
'command',
#'_command',
choices=[
o
for o in Command_args
],
#required=True,
)
options, args = parser.parse_known_args(argv[1:])
assert options.command in Command_args
if len(args) > 0 and args[0] == '--':
del args[0]
#options.command = Commands(options._command)
if options.command == 'js':
js(args)
elif options.command == 'host_deps':
host_deps(args)
elif options.command == 'env':
env(args, mode='exec',)
# elif options.command == 'mypy':
# if not inside_env():
# env(
# [
# pathlib.Path(__file__).parent / 'm.py',
# *argv[1:],
# ],
# mode='exec'
# )
# else:
# mypy(args)
elif options.command == 'ruff':
ruff(args)
elif options.command == 'm2':
if not inside_env():
env(['--', '_m.py', 'm2', *args])
return
import python.tasks.cython
python.tasks.cython.mypyc_build(
pathlib.Path('_m.py')
)
else:
raise NotImplementedError
if __name__ == '__main__':
run()

162
python/cli.py Normal file

@ -0,0 +1,162 @@
import sys
import shutil
import glob
import io
import copy
import subprocess
import pathlib
import logging
import enum
import argparse
import dataclasses
from typing import (Optional, override,)
from online.fxreader.pr34.commands_typed.logging import setup as logging_setup
from online.fxreader.pr34.commands_typed import cli as _cli
from online.fxreader.pr34.commands_typed import cli_bootstrap
logging_setup()
logger = logging.getLogger(__name__)
class Command(enum.StrEnum):
mypy = 'mypy'
deploy_wheel = 'deploy:wheel'
tests = 'tests'
@dataclasses.dataclass
class Settings(
_cli.DistSettings,
):
base_dir: pathlib.Path = pathlib.Path(__file__).parent.parent
build_dir: pathlib.Path = base_dir / 'tmp' / 'build'
wheel_dir: pathlib.Path = base_dir / 'deps' / 'dist'
env_path: pathlib.Path = cli_bootstrap.BootstrapSettings.get(base_dir).env_path
python_path: pathlib.Path = cli_bootstrap.BootstrapSettings.get(base_dir).python_path
class CLI(_cli.CLI):
def __init__(self) -> None:
self.settings = Settings()
self._projects: dict[str, _cli.Project] = {
'online.fxreader.pr34': _cli.Project(
source_dir=self.settings.base_dir / 'python',
build_dir=self.settings.base_dir / 'tmp' / 'online' / 'fxreader' / 'pr34' / 'build',
dest_dir=self.settings.base_dir / 'tmp' / 'online' / 'fxreader' / 'pr34' / 'install',
)
}
self._dependencies : dict[str, _cli.Dependency] = dict()
@override
@property
def dist_settings(self) -> _cli.DistSettings:
return self.settings
@override
@property
def projects(self) -> dict[str, _cli.Project]:
return self._projects
def mypy(
self,
argv: list[str],
) -> None:
import online.fxreader.pr34.commands_typed.mypy as _mypy
project = self._projects['online.fxreader.pr34']
_mypy.run(
argv,
settings=_mypy.MypySettings(
paths=[
#Settings.settings().project_root / 'dotfiles/.local/bin/commands',
# project.source_dir / 'm.py',
project.source_dir / '_m.py',
project.source_dir / 'online',
project.source_dir / 'cli.py',
self.settings.base_dir / 'm.py',
# Settings.settings().project_root / 'deps/com.github.aiortc.aiortc/src',
#Settings.settings().project_root / 'm.py',
],
max_errors={
'python/online/fxreader/pr34/commands_typed': 0,
'python/cli.py': 0,
'm.py': 0,
'deps/com.github.aiortc.aiortc/src/online_fxreader': 0,
'deps/com.github.aiortc.aiortc/src/aiortc/contrib/signaling': 0
}
),
)
@override
@property
def dependencies(self) -> dict[str, _cli.Dependency]:
return self._dependencies
def run(self, argv: Optional[list[str]] = None) -> None:
if argv is None:
argv = copy.deepcopy(sys.argv)
parser = argparse.ArgumentParser()
parser.add_argument(
'command',
choices=[
o.value
for o in Command
]
)
parser.add_argument(
'-p', '--project',
choices=[
o
for o in self.projects
]
)
parser.add_argument(
'-o', '--output_dir',
default=None,
help='wheel output dir for deploy:wheel',
)
parser.add_argument(
'-f', '--force',
default=False,
action='store_true',
help='remove install dir, before installing, default = false',
)
options, args = parser.parse_known_args(argv[1:])
options.command = Command(options.command)
if options.command is Command.deploy_wheel:
assert not options.project is None
self.deploy_wheel(
project_name=options.project,
argv=args,
output_dir=options.output_dir,
mypy=True,
)
elif options.command is Command.mypy:
self.mypy(
argv=args,
)
elif options.command is Command.tests:
for k, v in self.projects.items():
subprocess.check_call([
sys.executable,
'-m',
'unittest',
'online.fxreader.pr34.tests.test_crypto',
*args,
], cwd=str(v.source_dir))
else:
raise NotImplementedError
if __name__ == '__main__':
CLI().run()

File diff suppressed because it is too large

@ -0,0 +1,27 @@
__all__ = (
'parse_args',
)
import sys
import argparse
from typing import (Optional,)
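# Thin wrapper around ArgumentParser.parse_args: everything after a literal
# '--' is split off and returned separately as pass-through arguments.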
def parse_args(
parser: argparse.ArgumentParser,
args: Optional[list[str]] = None,
) -> tuple[argparse.Namespace, list[str]]:
if args is None:
args = sys.argv[1:]
argv : list[str] = []
for i, o in enumerate(args):
if o == '--':
argv.extend(args[i + 1:])
del args[i:]
break
return parser.parse_args(args), argv

@ -0,0 +1,14 @@
import logging
import asyncio
from typing import (Any,)
logger = logging.getLogger(__name__)
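# Done-callback for asyncio tasks: fetch the task result and log any
# exception so that failures of background tasks are not silently swallowed.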
def handle_task_result(fut: asyncio.Future[Any]) -> None:
try:
fut.result()
logger.debug(dict(fut=fut, msg='done'), stacklevel=2,)
except:
logger.exception('', stacklevel=2,)

@ -0,0 +1,473 @@
import dataclasses
import io
import glob
import os
import pathlib
import logging
import sys
import subprocess
import shutil
import abc
from .os import shutil_which
from typing import (
Optional,
Literal,
Any,
)
logger = logging.getLogger(__name__)
@dataclasses.dataclass
class Project:
source_dir : pathlib.Path
build_dir : pathlib.Path
dest_dir : pathlib.Path
meson_path: Optional[pathlib.Path] = None
@dataclasses.dataclass
class Dependency:
name: str
mode : Literal['pyproject', 'meson', 'meson-python', 'm']
source_path : pathlib.Path
args: Optional[list[str]] = None
@dataclasses.dataclass
class DistSettings:
wheel_dir : pathlib.Path
python_path: pathlib.Path
env_path: pathlib.Path
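# Base class for per-project cli.py entry points: subclasses provide
# dist_settings, projects and dependencies, and inherit the ruff/pyright,
# pip_sync, wheel deployment and meson helpers below.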
class CLI(abc.ABC):
@property
@abc.abstractmethod
def dist_settings(self) -> DistSettings:
raise NotImplementedError
@property
@abc.abstractmethod
def projects(self) -> dict[str, Project]:
raise NotImplementedError
@property
@abc.abstractmethod
def dependencies(self) -> dict[str, Dependency]:
raise NotImplementedError
def mypy(
self,
argv: list[str]
) -> None:
from . import mypy as _mypy
_mypy.run(
argv,
)
def ruff(
self,
project_name: str,
argv: list[str],
) -> None:
project = self.projects[project_name]
if len(argv) == 0:
argv = ['check', '.',]
subprocess.check_call([
self.dist_settings.python_path,
'-m',
'ruff',
'--config', str(project.source_dir / 'pyproject.toml'),
*argv,
])
def pyright(
self,
project_name: str,
argv: list[str],
) -> None:
project = self.projects[project_name]
if len(argv) == 0:
argv = ['--threads', '3', '.']
subprocess.check_call([
self.dist_settings.python_path,
'-m',
'pyright',
'-p', str(project.source_dir / 'pyproject.toml'),
*argv,
])
def pip_sync(
self,
project: str,
features: list[str],
) -> None:
from . import cli_bootstrap
pyproject = cli_bootstrap.pyproject_load(
self.projects[project].source_dir / 'pyproject.toml'
)
dependencies = sum([
pyproject.dependencies[o]
for o in features
], [])
pip_find_links : list[pathlib.Path] = []
if not pyproject.pip_find_links is None:
pip_find_links.extend(pyproject.pip_find_links)
logger.info(dict(
dependencies=dependencies,
))
if len(dependencies) > 0:
subprocess.check_call([
self.dist_settings.python_path,
'-m',
'uv', 'pip', 'install',
*sum([
['-f', str(o),]
for o in pip_find_links
], []),
# '-f', str(pathlib.Path(__file__).parent / 'deps' / 'dist'),
'--offline',
*dependencies,
])
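# For every declared dependency: if no matching wheel exists in wheel_dir (or
# force is set), build one via the dependency's own `m.py deploy:wheel` (only
# mode 'm' is implemented) and pip-install the newest wheel that appeared.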
def deploy_fetch_dist(
self,
force: bool,
) -> None:
for k, d in self.dependencies.items():
whl_glob = self.dist_settings.wheel_dir / ('*%s*.whl' % d.name.replace('.', '_'))
if len(glob.glob(
str(whl_glob)
)) == 0 or force:
if d.source_path.exists():
def whl_files_get() -> list[dict[str, Any]]:
return [
dict(
path=o,
stat=os.stat(o).st_mtime,
)
for o in glob.glob(
str(whl_glob)
)
]
present_files = whl_files_get()
if d.mode == 'm':
if (d.source_path / 'm.py').exists():
cmd = [
sys.executable,
str(d.source_path / 'm.py'),
'deploy:wheel',
'-o',
str(self.dist_settings.wheel_dir),
]
if not d.args is None:
cmd.extend(d.args)
subprocess.check_call(
cmd,
cwd=d.source_path,
)
else:
raise NotImplementedError
updated_files = whl_files_get()
def index_get(o: dict[str, Any]) -> tuple[Any, ...]:
return (o['path'], o['stat'])
present_files_index = {
index_get(o) : o
for o in present_files
}
new_files : list[dict[str, Any]] = []
for o in updated_files:
entry_index = index_get(o)
if not entry_index in present_files_index:
new_files.append(o)
if len(new_files) == 0:
raise NotImplementedError
latest_file = sorted(
new_files,
key=lambda x: x['stat']
)[-1]
subprocess.check_call([
self.dist_settings.python_path,
'-m', 'pip',
'install',
latest_file['path'],
])
@property
def pkg_config_path(self,) -> set[pathlib.Path]:
return {
pathlib.Path(o)
for o in glob.glob(
str(self.dist_settings.env_path / 'lib' / 'python*' / '**' / 'pkgconfig'),
recursive=True,
)
}
def deploy_wheel(
self,
project_name: str,
argv: Optional[list[str]] = None,
output_dir: Optional[pathlib.Path] = None,
force: Optional[bool] = None,
env: Optional[dict[str, str]] = None,
mypy: bool = False,
tests: bool = False,
) -> None:
project = self.projects[project_name]
# subprocess.check_call([
# sys.argv[0],
# # sys.executable,
# '-p', options.project,
# Command.meson_setup.value,
# ])
if argv is None:
argv = []
# assert argv is None or len(argv) == 0
if not project.meson_path is None:
if tests:
self.meson_test(
project_name=project_name,
)
self.meson_install(
project_name=project_name,
force=force,
)
if mypy:
self.mypy([])
if env is None:
env = dict()
extra_args: list[str] = []
if len(self.third_party_roots) > 0:
extra_args.extend([
'-Csetup-args=%s' % (
'-Dthird_party_roots=%s' % str(o.absolute())
)
for o in self.third_party_roots
])
cmd = [
sys.executable,
'-m',
'build',
'-w', '-n',
*extra_args,
'-Csetup-args=-Dmodes=pyproject',
'-Cbuild-dir=%s' % str(project.build_dir / 'pyproject'),
'-Csetup-args=-Dinstall_path=%s' % str(project.dest_dir),
# '-Cbuild-dir=%s' % str(project.build_dir),
str(project.source_dir),
*argv,
]
if not output_dir is None:
cmd.extend(['-o', str(output_dir)])
logger.info(dict(env=env))
subprocess.check_call(
cmd,
env=dict(list(os.environ.items())) | env,
)
if not project.meson_path is None:
if tests:
subprocess.check_call(
[
'ninja',
'-C',
str(project.build_dir / 'pyproject'),
'test',
]
)
def meson_install(
self,
project_name: str,
force: Optional[bool] = None,
argv: Optional[list[str]] = None,
) -> None:
project = self.projects[project_name]
if force is None:
force = False
if argv is None:
argv = []
if force and project.dest_dir.exists():
shutil.rmtree(project.dest_dir)
subprocess.check_call([
shutil_which('meson', True,),
'install',
'-C',
project.build_dir / 'meson',
'--destdir', project.dest_dir,
*argv,
])
for o in glob.glob(
str(project.dest_dir / 'lib' / 'pkgconfig' / '*.pc'),
recursive=True,
):
logger.info(dict(
path=o,
action='patch prefix',
))
with io.open(o, 'r') as f:
content = f.read()
with io.open(o, 'w') as f:
f.write(
content.replace('prefix=/', 'prefix=${pcfiledir}/../../')
)
def ninja(
self,
project_name: str,
argv: Optional[list[str]] = None,
env: Optional[dict[str, str]] = None,
) -> None:
project = self.projects[project_name]
if argv is None:
argv = []
if env is None:
env = dict()
logger.info(dict(env=env))
subprocess.check_call(
[
shutil_which('ninja', True),
'-C',
str(project.build_dir / 'meson'),
*argv,
],
env=dict(list(os.environ.items())) | env,
)
def meson_test(
self,
project_name: str,
argv: Optional[list[str]] = None,
) -> None:
project = self.projects[project_name]
if argv is None:
argv = []
subprocess.check_call([
shutil_which('meson', True,),
'test',
'-C',
project.build_dir / 'meson',
*argv,
])
def meson_compile(
self,
project_name: str,
argv: Optional[list[str]] = None,
) -> None:
project = self.projects[project_name]
if argv is None:
argv = []
subprocess.check_call([
shutil_which('meson', True,),
'compile',
'-C',
project.build_dir / 'meson',
*argv,
])
@property
def third_party_roots(self) -> list[pathlib.Path]:
return []
def meson_setup(
self,
project_name: str,
force: bool,
argv: Optional[list[str]] = None,
env: Optional[dict[str, str]] = None,
# third_party_roots: Optional[list[pathlib.Path]] = None,
) -> None:
project = self.projects[project_name]
if argv is None:
argv = []
if env is None:
env = dict()
logger.info(dict(env=env))
if force:
if (project.build_dir / 'meson').exists():
logger.info(dict(action='removing build dir', path=project.build_dir / 'meson'))
shutil.rmtree(project.build_dir / 'meson')
extra_args : list[str] = []
if len(self.third_party_roots) > 0:
extra_args.extend([
'-Dthird_party_roots=%s' % str(o.absolute())
for o in self.third_party_roots
])
cmd = [
shutil_which('meson', True,),
'setup',
str(project.source_dir),
str(project.build_dir / 'meson'),
'-Dmodes=["meson"]',
*extra_args,
# '-Dpkgconfig.relocatable=true',
'-Dprefix=/',
*argv,
]
logger.info(dict(cmd=cmd))
subprocess.check_call(
cmd,
env=dict(list(os.environ.items())) | env,
)

@ -0,0 +1,292 @@
#!/usr/bin/env python3
import glob
import io
import tempfile
import dataclasses
import pathlib
import sys
import subprocess
import os
import logging
import tomllib
from typing import (Self, Optional, Any,)
logger = logging.getLogger(__name__)
@dataclasses.dataclass
class PyProject:
path: pathlib.Path
dependencies: dict[str, list[str]]
early_features: Optional[list[str]] = None
pip_find_links: Optional[list[pathlib.Path]] = None
runtime_libdirs: Optional[list[pathlib.Path]] = None
runtime_preload: Optional[list[pathlib.Path]] = None
requirements: dict[str, pathlib.Path] = dataclasses.field(default_factory=lambda : dict())
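# Parse pyproject.toml into a PyProject: default plus optional dependency
# groups, and the [tool.online-fxreader-pr34] settings (early_features,
# pip_find_links, runtime_libdirs, runtime_preload, requirements).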
def pyproject_load(
d: pathlib.Path,
) -> PyProject:
with io.open(d, 'rb') as f:
content = tomllib.load(f)
assert isinstance(content, dict)
dependencies : dict[str, list[str]] = dict()
dependencies['default'] = content['project']['dependencies']
if (
'optional-dependencies' in content['project']
):
assert isinstance(
content['project']['optional-dependencies'],
dict
)
for k, v in content['project']['optional-dependencies'].items():
assert isinstance(v, list)
assert isinstance(k, str)
dependencies[k] = v
res = PyProject(
path=d,
dependencies=dependencies,
)
tool_name = 'online.fxreader.pr34'.replace('.', '-')
if (
'tool' in content and
isinstance(
content['tool'], dict
) and
tool_name in content['tool'] and
isinstance(
content['tool'][tool_name],
dict
)
):
if 'early_features' in content['tool'][tool_name]:
res.early_features = content['tool'][tool_name]['early_features']
if 'pip_find_links' in content['tool'][tool_name]:
res.pip_find_links = [
d.parent / pathlib.Path(o)
for o in content['tool'][tool_name]['pip_find_links']
]
if 'runtime_libdirs' in content['tool'][tool_name]:
res.runtime_libdirs = [
d.parent / pathlib.Path(o)
# pathlib.Path(o)
for o in content['tool'][tool_name]['runtime_libdirs']
]
if 'runtime_preload' in content['tool'][tool_name]:
res.runtime_preload = [
d.parent / pathlib.Path(o)
# pathlib.Path(o)
for o in content['tool'][tool_name]['runtime_preload']
]
if 'requirements' in content['tool'][tool_name]:
assert isinstance(content['tool'][tool_name]['requirements'], dict)
res.requirements = {
k : d.parent / pathlib.Path(v)
# pathlib.Path(o)
for k, v in content['tool'][tool_name]['requirements'].items()
}
return res
@dataclasses.dataclass
class BootstrapSettings:
env_path: pathlib.Path
python_path: pathlib.Path
base_dir: pathlib.Path
uv_args: list[str] = dataclasses.field(
default_factory=lambda : os.environ.get(
'UV_ARGS',
'--offline',
).split(),
)
@classmethod
def get(
cls,
base_dir: Optional[pathlib.Path] = None,
) -> Self:
if base_dir is None:
base_dir = pathlib.Path.cwd()
env_path = base_dir / '.venv'
python_path = env_path / 'bin' / 'python3'
return cls(
base_dir=base_dir,
env_path=env_path,
python_path=python_path,
)
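# Bootstrap the .venv: compile a hash-pinned requirements file with
# `uv pip compile --generate-hashes` when it does not exist yet, create the
# venv with `uv venv`, and install the requirements with --require-hashes.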
def env_bootstrap(
bootstrap_settings: BootstrapSettings,
pyproject: PyProject,
) -> None:
pip_find_links : list[pathlib.Path] = []
if not pyproject.pip_find_links is None:
pip_find_links.extend(pyproject.pip_find_links)
pip_find_links_args = sum([
['-f', str(o),]
for o in pip_find_links
], [])
features : list[str] = []
if pyproject.early_features:
features.extend(pyproject.early_features)
requirements_name = '_'.join(sorted(features))
requirements_path : Optional[pathlib.Path] = None
if requirements_name in pyproject.requirements:
requirements_path = pyproject.requirements[requirements_name]
else:
requirements_path = pyproject.path.parent / 'requirements.txt'
requirements_in : list[str] = []
requirements_in.extend([
'uv', 'pip', 'build', 'setuptools', 'meson-python', 'pybind11'
])
if pyproject.early_features:
early_dependencies = sum([
pyproject.dependencies[o]
for o in pyproject.early_features
], [])
logger.info(dict(
early_dependencies=early_dependencies,
))
requirements_in.extend(early_dependencies)
# if len(early_dependencies) > 0:
# subprocess.check_call([
# bootstrap_settings.python_path,
# '-m',
# 'uv', 'pip', 'install',
# *pip_find_links_args,
# # '-f', str(pathlib.Path(__file__).parent / 'deps' / 'dist'),
# *bootstrap_settings.uv_args,
# *early_dependencies,
# ])
if not requirements_path.exists():
with tempfile.NamedTemporaryFile(
mode='w',
prefix='requirements',
suffix='.in',
) as f:
f.write(
'\n'.join(requirements_in)
)
f.flush()
subprocess.check_call([
'uv',
'pip',
'compile',
'--generate-hashes',
*pip_find_links_args,
# '-p',
# bootstrap_settings.python_path,
*bootstrap_settings.uv_args,
'-o', str(requirements_path),
f.name,
])
subprocess.check_call([
'uv', 'venv',
*pip_find_links_args,
# '--seed',
*bootstrap_settings.uv_args,
str(bootstrap_settings.env_path)
])
subprocess.check_call([
'uv',
'pip',
'install',
*pip_find_links_args,
'-p',
bootstrap_settings.python_path,
'--require-hashes',
*bootstrap_settings.uv_args,
'-r', str(requirements_path),
])
def paths_equal(
a: pathlib.Path | str,
b: pathlib.Path | str
) -> bool:
return (
os.path.abspath(str(a)) ==
os.path.abspath(str(b))
)
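# Create the venv if needed, re-exec under the venv's python3 when the
# current interpreter differs from it, then exec cli.py with the original
# arguments.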
def run(
d: Optional[pathlib.Path] = None,
cli_path: Optional[pathlib.Path] = None,
) -> None:
if cli_path is None:
cli_path = pathlib.Path(__file__).parent / 'cli.py'
if d is None:
d = pathlib.Path(__file__).parent / 'pyproject.toml'
bootstrap_settings = BootstrapSettings.get()
pyproject : PyProject = pyproject_load(
d
)
logging.basicConfig(level=logging.INFO)
if not bootstrap_settings.env_path.exists():
env_bootstrap(
bootstrap_settings=bootstrap_settings,
pyproject=pyproject,
)
logger.info([sys.executable, sys.argv, bootstrap_settings.python_path])
if not paths_equal(sys.executable, bootstrap_settings.python_path):
os.execv(
str(bootstrap_settings.python_path),
[
str(bootstrap_settings.python_path),
*sys.argv,
]
)
os.execv(
str(bootstrap_settings.python_path),
[
str(bootstrap_settings.python_path),
str(
cli_path
),
*sys.argv[1:],
]
)
if __name__ == '__main__':
run()

@ -0,0 +1,90 @@
import base64
import os
import cryptography.exceptions
import cryptography.hazmat.primitives.kdf.scrypt
from typing import (Literal, overload, Optional,)
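# scrypt-based password hashing: secret_hash derives a (salt, hash) pair as
# raw bytes or base64 strings, secret_check verifies a candidate secret
# against a previously stored pair.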
class PasswordUtils:
@overload
@classmethod
def secret_hash(
cls,
secret: str | bytes,
mode: Literal['base64'],
salt: Optional[bytes] = None,
) -> tuple[str, str]: ...
@overload
@classmethod
def secret_hash(
cls,
secret: str | bytes,
mode: Literal['bytes'],
salt: Optional[bytes] = None,
) -> tuple[bytes, bytes]: ...
@classmethod
def secret_hash(
cls,
secret: str | bytes,
mode: Literal['bytes', 'base64'],
salt: Optional[bytes] = None,
) -> tuple[str, str] | tuple[bytes, bytes]:
if salt is None:
salt = os.urandom(16)
if isinstance(secret, str):
secret = secret.encode('utf-8')
# derive
kdf = cls._scrypt_init(salt=salt)
hashed_secret = kdf.derive(secret)
if mode == 'bytes':
return (salt, hashed_secret)
elif mode == 'base64':
res_tuple = tuple((
base64.b64encode(o).decode('utf-8')
for o in (salt, hashed_secret,)
))
return (res_tuple[0], res_tuple[1])
else:
raise NotImplementedError
@classmethod
def _scrypt_init(
cls,
salt: bytes
) -> cryptography.hazmat.primitives.kdf.scrypt.Scrypt:
return cryptography.hazmat.primitives.kdf.scrypt.Scrypt(
salt=salt,
length=32,
n=2**14,
r=8,
p=1,
)
@classmethod
def secret_check(
cls,
secret: str | bytes,
salt: str | bytes,
hashed_secret: str | bytes,
) -> bool:
if isinstance(salt, str):
salt = base64.b64decode(salt)
if isinstance(secret, str):
secret = secret.encode('utf-8')
if isinstance(hashed_secret, str):
hashed_secret = base64.b64decode(hashed_secret)
kdf = cls._scrypt_init(salt=salt)
try:
kdf.verify(secret, hashed_secret)
return True
except cryptography.exceptions.InvalidKey:
return False

@ -0,0 +1,35 @@
import os
import logging
from typing import (Optional,)
logger = logging.getLogger(__name__)
class DebugPy:
@classmethod
def set_trace(
cls,
host: Optional[str] = None,
port: Optional[int] = None,
wait: Optional[bool] = None,
) -> None:
if host is None:
host = '127.0.0.1'
if port is None:
port = 4444
if wait is None:
wait = True
import debugpy
if os.environ.get('DEBUGPY_RUNNING') != 'true':
logger.info('debugpy init')
debugpy.listen((host, port))
os.environ['DEBUGPY_RUNNING'] = 'true'
if wait:
debugpy.wait_for_client()
debugpy.breakpoint()
logger.info('debugpy done')

@ -0,0 +1,16 @@
import logging
from typing import (Optional,)
def setup(level: Optional[int] = None) -> None:
if level is None:
level = logging.INFO
logging.basicConfig(
level=level,
format=(
'%(levelname)s:%(name)s:%(message)s'
':%(process)d'
':%(asctime)s'
':%(pathname)s:%(funcName)s:%(lineno)s'
),
)

@ -0,0 +1,216 @@
import pydantic.dataclasses
import datetime
import pydantic_settings
import marisa_trie
import json
import pathlib
import subprocess
import logging
import sys
import argparse
from pydantic import (Field,)
from typing import (ClassVar, Generator, Annotated, Optional, Any,)
logger = logging.getLogger(__name__)
@pydantic.dataclasses.dataclass
class MypyFormatEntry:
name : str
value : str
def __eq__(self, other: object) -> bool:
if not isinstance(other, type(self)):
raise NotImplementedError
return self.value == other.value
class MypyFormat:
vscode : ClassVar[MypyFormatEntry] = MypyFormatEntry(name='vscode', value='vscode')
json : ClassVar[MypyFormatEntry] = MypyFormatEntry(name='json', value='json')
@classmethod
def from_value(cls, value: str) -> MypyFormatEntry:
for e in cls.entries():
if value == e.value:
return e
raise NotImplementedError
@classmethod
def entries(cls) -> Generator[MypyFormatEntry, None, None,]:
for o in dir(cls):
e = getattr(cls, o)
if not isinstance(e, MypyFormatEntry):
continue
yield e
class MypySettings(pydantic_settings.BaseSettings):
model_config = pydantic_settings.SettingsConfigDict(
env_prefix='online_fxreader_pr34_mypy_',
case_sensitive=False,
)
config_path : pathlib.Path = pathlib.Path.cwd() / '.mypy.ini'
max_errors : dict[str, int] = dict()
paths : Annotated[list[pathlib.Path], Field(default_factory=lambda : ['.'])]
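# Wrapper around `mypy --strict` with JSON output: errors are grouped per
# file and checked against the per-path `max_errors` budgets; exceeding a
# budget makes the run exit non-zero.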
def run(
argv: Optional[list[str]] = None,
settings: Optional[MypySettings] = None,
) -> None:
if argv is None:
argv = []
if settings is None:
settings = MypySettings()
parser = argparse.ArgumentParser()
parser.add_argument(
'-q', '--quiet',
dest='quiet',
action='store_true',
help='do not print anything if the program is correct according to max_errors limits',
default=False,
)
parser.add_argument(
'-i',
dest='paths',
help='specify paths to check',
default=[],
action='append',
)
parser.add_argument(
'-f', '--format',
dest='_format',
help='output format of errors',
default=MypyFormat.json.value,
choices=[
o.value
for o in MypyFormat.entries()
],
)
options, args = parser.parse_known_args(argv)
if len(args) > 0 and args[0] == '--':
del args[0]
options.format = MypyFormat.from_value(options._format)
if len(options.paths) == 0:
options.paths.extend(settings.paths)
started_at = datetime.datetime.now()
mypy_cmd = [
sys.executable,
'-m',
'mypy',
'--config-file', str(settings.config_path),
'--strict',
'-O',
'json',
*args,
*options.paths,
]
logger.info(dict(cmd=mypy_cmd))
res = subprocess.run(
mypy_cmd,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
)
done_at = datetime.datetime.now()
try:
assert not res.returncode is None
errors = sorted([
json.loads(o)
for o in res.stdout.decode('utf-8').splitlines()
if not o.strip() == ''
], key=lambda x: (
x.get('file', ''),
x.get('line', 0),
))
if not options.quiet:
if (len(res.stderr)) > 0:
logger.error(res.stderr.decode('utf-8'))
except:
logger.exception('')
logger.error(res.stdout.decode('utf-8'))
logger.error(res.stderr.decode('utf-8'))
sys.exit(res.returncode)
g : dict[str, Any] = dict()
for o in errors:
if not o['file'] in g:
g[o['file']] = []
g[o['file']].append(o)
h = {
k : len(v)
for k, v in sorted(
list(g.items()),
key=lambda x: x[0],
)
}
mentioned_paths = marisa_trie.Trie(list(h))
violated_limits : dict[str, str] = dict()
for k, v in settings.max_errors.items():
matching_paths = mentioned_paths.keys(k)
total_errors = sum([
h[o]
for o in matching_paths
], 0)
if total_errors > v:
violated_limits[k] = '%s - [%s]: has %d errors > %d' % (
k, ', '.join(matching_paths), total_errors, v,
)
if len(violated_limits) > 0 or not options.quiet:
if options.format == MypyFormat.vscode:
for o in errors:
sys.stdout.write('[%s] %s:%d,%d %s - %s - %s\n' % (
o['severity'],
o['file'],
o['line'],
o['column'],
o['message'],
o['hint'],
o['code'],
))
sys.stdout.flush()
#logger.info(json.dumps(errors, indent=4))
else:
logger.info(json.dumps(errors, indent=4))
#if len(violated_limits) > 0:
# logger.info(json.dumps(violated_limits, indent=4))
logger.info(json.dumps(dict(
max_errors=settings.max_errors,
violated_limits=violated_limits,
histogram=h,
elapsed=(done_at - started_at).total_seconds(),
), indent=4))
if len(violated_limits) > 0:
sys.exit(1)
if __name__ == '__main__':
from . import logging as _logging
_logging.setup()
run(sys.argv[1:])

@ -0,0 +1,92 @@
import shutil
import glob
import pathlib
import ctypes
import os
import sys
import logging
logger = logging.getLogger(__name__)
from typing import (overload, Optional, Literal,)
from .cli_bootstrap import PyProject
@overload
def shutil_which(
name: str,
raise_on_failure: Literal[True],
) -> str: ...
@overload
def shutil_which(
name: str,
raise_on_failure: bool,
) -> Optional[str]: ...
def shutil_which(
name: str,
raise_on_failure: bool,
) -> Optional[str]:
res = shutil.which(name)
if res is None and raise_on_failure:
raise NotImplementedError
else:
return res
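# Extend LD_LIBRARY_PATH with the project's runtime_libdirs and preload the
# configured shared libraries via ctypes (Linux only).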
def runtime_libdirs_init(
project: PyProject,
) -> None:
if sys.platform == 'linux':
ld_library_path : list[pathlib.Path] = [
o
for o in [
*[
o.absolute()
for o in (
project.runtime_libdirs
if project.runtime_libdirs
else []
)
],
*[
pathlib.Path(o)
for o in os.environ.get(
'LD_LIBRARY_PATH',
''
).split(os.path.pathsep)
if o != ''
]
]
]
ld_library_path_present : list[pathlib.Path] = []
for o in ld_library_path:
if not o.exists():
logger.warning(dict(
ld_library_path=o,
msg='not found',
))
continue
ld_library_path_present.append(o)
os.environ.update(
LD_LIBRARY_PATH=os.path.pathsep.join([
str(o) for o in ld_library_path_present
])
)
for preload_path in (project.runtime_preload or []):
for preload_found in glob.glob(str(
preload_path.parent / ('lib%s.so' % preload_path.name)
)):
logger.info(dict(
preload_path=preload_path, preload_found=preload_found,
# lib_path=o,
msg='load_library',
))
ctypes.cdll.LoadLibrary(preload_found)
else:
raise NotImplementedError

@ -0,0 +1,524 @@
import contextlib
import pathlib
import sys
import enum
import dataclasses
import subprocess
import tempfile
import unittest.mock
import logging
import typing
if typing.TYPE_CHECKING:
import pip._internal.commands.show
import pip._internal.commands.download
import pip._internal.cli.main_parser
import pip._internal.models.index
import pip._internal.utils.temp_dir
import pip._internal.cli.main
import pip._internal.network.download
import pip._internal.resolution.base
import pip._internal.resolution.resolvelib.resolver
import pip._internal.operations.prepare
from typing import (
Literal, Optional, Iterable, Any,
)
logger = logging.getLogger(__name__)
def pip_show(
argv: list[str],
) -> list['pip._internal.commands.show._PackageInfo']:
import pip._internal.commands.show
return list(
pip._internal.commands.show.search_packages_info(
argv,
)
)
class pip_resolve_t:
class kwargs_t:
class mode_t(enum.StrEnum):
copy_paste = "copy_paste"
monkey_patch = "monkey_patch"
uv_pip_freeze = "uv_pip_freeze"
uv_pip_compile = "uv_pip_compile"
@dataclasses.dataclass
class res_t:
@dataclasses.dataclass
class download_info_t:
url: str
sha256: str
constraint: str
txt: Optional[str] = None
entries: Optional[list[download_info_t]] = None
def pip_resolve_entries_to_txt(
entries: list[pip_resolve_t.res_t.download_info_t]
) -> str:
return '\n'.join([
'#%s\n%s --hash=sha256:%s' % (
o.url,
o.constraint,
o.sha256,
)
for o in entries
])
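# Resolve pinned, hashed requirements in one of four modes: 'copy_paste'
# drives pip's internal download/resolve machinery directly; 'monkey_patch'
# patches pip's downloader and resolver to capture the resolved links (and
# then aborts on purpose); 'uv_pip_freeze' and 'uv_pip_compile' shell out to
# uv.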
def pip_resolve(
argv: list[str],
mode: pip_resolve_t.kwargs_t.mode_t,
requirements: Optional[list[str]] = None,
) -> pip_resolve_t.res_t:
if mode is pip_resolve_t.kwargs_t.mode_t.copy_paste:
import pip._internal.commands.show
import pip._internal.commands.download
import pip._internal.cli.cmdoptions
import pip._internal.cli.main_parser
import pip._internal.models.index
import pip._internal.utils.temp_dir
import pip._internal.cli.main
import pip._internal.network.download
import pip._internal.resolution.base
import pip._internal.req.req_install
import pip._internal.resolution.resolvelib.resolver
import pip._internal.operations.prepare
import pip._internal.utils.temp_dir
import pip._internal.operations.build.build_tracker
import pip._internal.models.direct_url
with contextlib.ExitStack() as stack:
stack.enter_context(pip._internal.utils.temp_dir.global_tempdir_manager())
t2 = pip._internal.cli.main_parser.create_main_parser()
t3 = t2.parse_args(["download"])
t1 = pip._internal.commands.download.DownloadCommand("blah", "shit")
stack.enter_context(t1.main_context())
# options = pip._internal.commands.download.Values()
options = t3[0]
options.python_version = None
options.platforms = []
options.abis = []
options.implementation = []
options.format_control = None
options.ignore_dependencies = None
options.index_url = pip._internal.models.index.PyPI.simple_url
options.extra_index_urls = []
options.no_index = None
options.find_links = []
options.pre = None
options.prefer_binary = True
options.only_binary = True
options.constraints = []
options.use_pep517 = None
options.editables = []
options.requirements = []
options.src_dir = str(pathlib.Path(__file__).parent)
options.build_isolation = None
options.check_build_deps = None
options.progress_bar = True
options.require_hashes = None
options.ignore_requires_python = False
# options.cache_dir
pip._internal.cli.cmdoptions.check_dist_restriction(options)
# t1._in_main_context = True
session = t1.get_default_session(options)
target_python = pip._internal.cli.cmdoptions.make_target_python(options)
finder = t1._build_package_finder(
options=options,
session=session,
target_python=target_python,
ignore_requires_python=options.ignore_requires_python,
)
build_tracker = t1.enter_context(
pip._internal.operations.build.build_tracker.get_build_tracker()
)
reqs = t1.get_requirements(
[
#'pip', 'uv', 'ipython',
*argv,
],
options,
finder,
session,
)
pip._internal.req.req_install.check_legacy_setup_py_options(options, reqs)
directory = pip._internal.utils.temp_dir.TempDirectory(
delete=True, kind="download", globally_managed=True
)
preparer = t1.make_requirement_preparer(
temp_build_dir=directory,
options=options,
build_tracker=build_tracker,
session=session,
finder=finder,
download_dir=None,
use_user_site=False,
verbosity=False,
)
resolver = t1.make_resolver(
preparer=preparer,
finder=finder,
options=options,
ignore_requires_python=options.ignore_requires_python,
use_pep517=options.use_pep517,
py_version_info=options.python_version,
)
t1.trace_basic_info(finder)
requirement_set = resolver.resolve(reqs, check_supported_wheels=True)
res = pip_resolve_t.res_t()
res.entries = []
for k, v in requirement_set.requirements.items():
assert not v.download_info is None
assert isinstance(
v.download_info.info,
pip._internal.models.direct_url.ArchiveInfo,
)
assert not v.download_info.info.hashes is None
res.entries.append(
pip_resolve_t.res_t.download_info_t(
constraint=k,
sha256=v.download_info.info.hashes["sha256"],
url=v.download_info.url,
)
)
res.txt = pip_resolve_entries_to_txt(
res.entries
)
return res
elif mode is pip_resolve_t.kwargs_t.mode_t.monkey_patch:
import pip._internal.commands.show
import pip._internal.commands.download
import pip._internal.cli.main_parser
import pip._internal.models.index
import pip._internal.models.link
from pip._internal.models.link import (
Link,
)
import pip._internal.utils.temp_dir
from pip._internal.metadata.base import (
BaseDistribution,
)
import pip._internal.cli.main
import pip._internal.network.download
import pip._internal.resolution.base
import pip._internal.resolution.resolvelib.resolver
import pip._internal.operations.prepare
from pip._internal.network.download import (
Downloader,
)
from pip._internal.operations.prepare import (
File,
)
from pip._internal.req.req_set import RequirementSet
from pip._internal.utils.hashes import Hashes
from pip._internal.req.req_install import InstallRequirement
downloader_call_def = pip._internal.network.download.Downloader.__call__
def downloader_call(
_self: pip._internal.network.download.Downloader,
link: pip._internal.models.link.Link,
location: str,
) -> tuple[str, str]:
logger.info(
dict(
url=link.url,
)
)
return downloader_call_def(
_self,
link, location,
)
batch_downloader_call_def = (
pip._internal.network.download.BatchDownloader.__call__
)
def batch_downloader_call(
_self: pip._internal.network.download.BatchDownloader,
links: Iterable[pip._internal.models.link.Link],
location: str,
) -> Iterable[
tuple[
pip._internal.models.link.Link,
tuple[str, str]
]
]:
# print(args)
logger.info(
dict(
links=links,
location=location,
)
)
return [
(o, ("/dev/null", ''))
for o in links
]
# base_resolver_resolve_def = pip._internal.resolution.base.BaseResolver.resolve
base_resolver_resolve_def = (
pip._internal.resolution.resolvelib.resolver.Resolver.resolve
)
result_requirements : list[
RequirementSet | InstallRequirement
] = []
def base_resolver_resolve(
_self: pip._internal.resolution.resolvelib.resolver.Resolver,
root_reqs: list[
InstallRequirement,
],
check_supported_wheels: bool,
) -> RequirementSet:
# print(args, kwargs)
res = base_resolver_resolve_def(
_self,
root_reqs,
check_supported_wheels
)
result_requirements.append(res)
raise NotImplementedError
return res
get_http_url_def = pip._internal.operations.prepare.get_http_url
def get_http_url(
link: Link,
download: Downloader,
download_dir: Optional[str] = None,
hashes: Optional[Hashes] = None,
) -> File:
logger.info(
dict(
url=link.url,
hashes=hashes,
)
)
if link.url.endswith(".whl"):
print("blah")
hashes = None
return File(
"/dev/null",
'',
)
else:
return get_http_url_def(
link,
download,
download_dir,
hashes
)
prepare_linked_requirements_more_def = pip._internal.operations.prepare.RequirementPreparer.prepare_linked_requirements_more
def prepare_linked_requirements_more(
_self: pip._internal.resolution.resolvelib.resolver.Resolver,
reqs: Iterable[InstallRequirement],
parallel_builds: bool = False,
) -> None:
result_requirements.extend(
reqs
)
raise NotImplementedError
_complete_partial_requirements_def = pip._internal.operations.prepare.RequirementPreparer._complete_partial_requirements
def _complete_partial_requirements(
_self: pip._internal.resolution.resolvelib.resolver.Resolver,
partially_downloaded_reqs: Iterable[InstallRequirement],
parallel_builds: bool = False,
) -> None:
result_requirements.extend(
partially_downloaded_reqs
)
raise NotImplementedError
patches : list[Any] = []
patches.append(
unittest.mock.patch.object(
pip._internal.network.download.Downloader, "__call__", downloader_call
)
)
# patches.append(
# unittest.mock.patch.object(
# pip._internal.network.download.BatchDownloader,
# '__call__',
# batch_downloader_call
# )
# )
# patches.append(
# unittest.mock.patch.object(
# pip._internal.resolution.base.BaseResolver, 'resolve', base_resolver_resolve))
patches.append(
unittest.mock.patch.object(
pip._internal.resolution.resolvelib.resolver.Resolver,
"resolve",
base_resolver_resolve,
)
)
patches.append(
unittest.mock.patch.object(
pip._internal.operations.prepare,
"get_http_url",
get_http_url,
)
)
patches.append(
unittest.mock.patch.object(
pip._internal.operations.prepare.RequirementPreparer,
"prepare_linked_requirements_more",
prepare_linked_requirements_more,
)
)
# patches.append(
# unittest.mock.patch.object(
# pip._internal.operations.prepare.RequirementPreparer,
# '_complete_partial_requirements',
# _complete_partial_requirements
# )
# )
with contextlib.ExitStack() as stack:
for p in patches:
stack.enter_context(p)
pip._internal.cli.main.main(
[
"download",
"-q",
"--no-cache",
"-d",
"/dev/null",
*argv,
# 'numpy',
]
)
# return sum([
# [
# pip_resolve_t.res_t.download_info_t(
# constraint=k,
# sha256=v.download_info.info.hashes['sha256'],
# url=v.download_info.url,
# )
# for k, v in o.requirements.items()
# ]
# for o in result_requirements
# ], [])
logger.warning(result_requirements)
res = pip_resolve_t.res_t()
res.entries = []
for o in result_requirements:
assert isinstance(o, InstallRequirement)
sha256_hashes = o.hashes()._allowed["sha256"]
assert len(sha256_hashes) == 1
assert not o.link is None
res.entries.append(
pip_resolve_t.res_t.download_info_t(
constraint=str(o.req),
sha256=sha256_hashes[0],
url=o.link.url,
)
)
res.txt = pip_resolve_entries_to_txt(
res.entries
)
return res
elif mode is pip_resolve_t.kwargs_t.mode_t.uv_pip_freeze:
assert len(argv) == 0
pip_freeze = subprocess.check_output(
[
sys.executable,
"-m",
"uv",
"pip",
"freeze",
],
).decode('utf-8')
pip_compile = subprocess.check_output(
[
sys.executable, '-m',
'uv', 'pip', 'compile',
'--generate-hashes',
'-',
],
input=pip_freeze.encode('utf-8')
).decode('utf-8')
return pip_resolve_t.res_t(
txt=pip_compile,
)
elif mode is pip_resolve_t.kwargs_t.mode_t.uv_pip_compile:
with contextlib.ExitStack() as stack:
if not requirements is None:
# assert len(argv) == 0
f = stack.enter_context(
tempfile.NamedTemporaryFile(
suffix='.txt',
)
)
f.write(
('\n'.join(requirements)).encode('utf-8')
)
f.flush()
argv.append(f.name)
if argv[0] == '--':
del argv[0]
pip_compile = subprocess.check_output(
[
sys.executable, '-m',
'uv', 'pip', 'compile',
'--generate-hashes',
*argv,
],
).decode('utf-8')
return pip_resolve_t.res_t(
txt=pip_compile,
)
else:
raise NotImplementedError

@ -0,0 +1,27 @@
# https://github.com/python/typing/issues/59#issuecomment-353878355
# https://gitea.fxreader.online/fxreader.online/freelance-project-34-marketing-blog/issues/2#issue-25
import typing
from typing import Any
from typing_extensions import Protocol
from abc import abstractmethod
C = typing.TypeVar("C", bound="Comparable")
class Comparable(Protocol):
@abstractmethod
def __eq__(self, other: Any) -> bool:
pass
@abstractmethod
def __lt__(self: C, other: C) -> bool:
pass
def __gt__(self: C, other: C) -> bool:
return (not self < other) and self != other
def __le__(self: C, other: C) -> bool:
return self < other or self == other
def __ge__(self: C, other: C) -> bool:
return (not self < other)

@ -0,0 +1,277 @@
import time
import glob
import io
import os
import numpy
import numpy.typing
import functools
import pathlib
import threading
import cython
import datetime
from typing import (Any, Optional, TypeVar, Type, cast)
# from scoping import scoping as s
def test(
_id: int,
T: float,
a: numpy.ndarray[Any, numpy.dtype[numpy.int32]],
) -> None:
with cython.nogil:
#if True:
started_at = datetime.datetime.now()
print('started')
def elapsed() -> float:
return (datetime.datetime.now() - started_at).total_seconds()
#a = 0
while elapsed() < T:
#a += 1
for k in range(1024 * 1024):
a[_id] += 1
print(['done', started_at, elapsed(), a[_id]])
M = TypeVar('M', bound=Type[Any])
def build(content: str, module: M) -> M:
import pathlib
import tempfile
import hashlib
import Cython.Build.Inline
sha256sum = hashlib.sha256(content.encode('utf-8')).digest().hex()
output_dir = (pathlib.Path('.') / 'tmp' / 'cython' / sha256sum).absolute()
if not output_dir.exists() or True:
os.makedirs(str(output_dir), exist_ok=True)
source_path = output_dir / ('_%s.pyx' % sha256sum)
if not source_path.exists():
with io.open(str(source_path), 'w') as f:
f.write(content)
t1 = Cython.Build.Inline._get_build_extension()
t1.extensions = Cython.Build.cythonize(str(source_path))
t1.build_temp = str(pathlib.Path('/'))
t1.build_lib = str(output_dir)
#t2 = Cython.Build.Inline.Extension(
# name=sha256sum,
#)
t1.run()
return cast(
M,
Cython.Build.Inline.load_dynamic(
'_%s' % sha256sum,
glob.glob(
str(output_dir / ('_%s*.so' % sha256sum))
)[0]
)
)
raise NotImplementedError
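# Compile file_path with mypyc when the corresponding .so is missing or older
# than the source, then load the extension module dynamically.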
def mypyc_build(file_path: pathlib.Path) -> Any:
import pathlib
import tempfile
import hashlib
import mypyc.build
import Cython.Build.Inline
assert isinstance(file_path, pathlib.Path)
#sha256sum = hashlib.sha256(content.encode('utf-8')).digest().hex()
#output_dir = (pathlib.Path('.') / 'tmp' / 'cython' / sha256sum).absolute()
output_dir = pathlib.Path('.') / 'tmp' / 'mypyc'
sha256sum = file_path.stem
lib_pattern = file_path.parent / ('%s.cpython*.so' % sha256sum)
lib_dir = pathlib.Path('.')
def lib_path_glob(path: str | pathlib.Path) -> Optional[pathlib.Path]:
res : list[str] = glob.glob(str(path))
if len(res) == 0:
return None
else:
return pathlib.Path(res[0])
need_build : bool = False
lib_path : Optional[pathlib.Path] = None
lib_path = lib_path_glob(lib_pattern)
if not lib_path is None:
t2 = file_path.stat()
t3 = lib_path.stat()
if t3.st_mtime < t2.st_mtime:
need_build = True
del t2
del t3
else:
need_build = True
if need_build:
for o in [
output_dir,
output_dir / 'build' / file_path.parent,
]:
os.makedirs(
str(o),
exist_ok=True
)
#source_path = output_dir / ('_%s.py' % sha256sum)
source_path = file_path
#with io.open(str(source_path), 'w') as f:
# f.write(content)
t1 = Cython.Build.Inline._get_build_extension()
t1.extensions = mypyc.build.mypycify(
[str(source_path)],
target_dir=str(output_dir / 'build')
)
t1.build_temp = str(output_dir)
t1.build_lib = str(lib_dir)
#t2 = Cython.Build.Inline.Extension(
# name=sha256sum,
#)
t1.run()
lib_path = lib_path_glob(lib_pattern)
return Cython.Build.Inline.load_dynamic(
#'_%s' % sha256sum,
#t1.extensions[0].name,
file_path.stem,
str(lib_path),
)
raise NotImplementedError
class Source:
@staticmethod
def test2(
_a : numpy.ndarray[Any, numpy.dtype[numpy.int64]],
_id : numpy.dtype[numpy.int32] | int,
T : float=16
) -> int:
raise NotImplementedError
source = build(r'''
cimport cython
@cython.boundscheck(False)
@cython.wraparound(False)
def test4(int[:] a, int[:] b):
cdef int N = a.shape[0]
assert N == b.shape[0]
with cython.nogil:
for i in range(N):
a[i] += b[i]
return N
import datetime
def elapsed(started_at: datetime.datetime):
res = (datetime.datetime.now() - started_at).total_seconds()
return res
@cython.boundscheck(False) # Deactivate bounds checking
@cython.wraparound(False) # Deactivate negative indexing.
def has_time(started_at: datetime.datetime, T: float):
t1 = elapsed(started_at)
res = t1 < T
return res
@cython.boundscheck(False)
@cython.wraparound(False)
def test2(long long [:] _a, int _id, double T=16) -> int:
started_at = datetime.datetime.now()
print('started')
cdef int C = 1;
cdef int cond;
with cython.nogil:
#if True:
#a = 0
while True:
with cython.gil:
cond = has_time(started_at, T)
#cond = 0
if cond != 1:
break
#a += 1
for k in range(1024 * 1024 * 1024):
_a[_id] += C
print(['done', started_at, elapsed(started_at), _a[_id]])
return _a[_id]
''', Source)
def test_cython(N: int=4, T:int=16) -> None:
#a = [0] * N
a = numpy.zeros((N,), dtype=numpy.int64)
t = [
threading.Thread(
target=functools.partial(
source.test2,
a,
k,
T,
)
)
for k in range(N)
]
for o in t:
o.start()
for o in t:
o.join()
#cython_module['test2'](a, 0)
def test_mypyc(N: int=4, W:int=35) -> None:
cython2 = mypyc_build(
(pathlib.Path(__file__).parent / 'cython2.py').relative_to(
pathlib.Path.cwd()
)
)
# from .cython2 import fib
#a = [0] * N
t = [
threading.Thread(
target=functools.partial(
cython2.fib,
W,
)
)
for k in range(N)
]
for o in t:
o.start()
for o in t:
o.join()

@ -0,0 +1,11 @@
import time
def fib(n: int) -> int:
if n <= 1:
return n
else:
return fib(n - 2) + fib(n - 1)
t0 = time.time()
fib(32)
print(time.time() - t0)

@ -0,0 +1,36 @@
from online.fxreader.pr34.commands_typed import crypto
import unittest
class TestCrypto(unittest.TestCase):
def test_password_utils(self) -> None:
salt = b'asdfasdfasdf'
secret = 'blah'
hash_res = crypto.PasswordUtils.secret_hash(
secret,
mode='bytes',
salt=salt,
)
self.assertEqual(
hash_res,
(
salt,
b'\xdak\xd15\xfa\x8e\xc8\r\xc3\xd2c\xf1m\xb0\xbf\xe6\x98\x01$!j\xc8\xc0Hh\x84\xea,\x91\x8b\x08\xce',
),
)
check_res = crypto.PasswordUtils.secret_check(
secret,
*hash_res,
)
self.assertTrue(check_res)
self.assertFalse(
crypto.PasswordUtils.secret_check(
secret + 'asdfasdfsdf',
*hash_res,
)
)

53
python/pyproject.toml Normal file

@ -0,0 +1,53 @@
[project]
name = 'online.fxreader.pr34'
version = '0.1.5.11'
dependencies = [
#"-r requirements.txt",
'mypy',
'marisa-trie',
'pydantic',
'pydantic-settings',
]
[project.optional-dependencies]
crypto = [
'cryptography',
]
early = [
'numpy',
'cryptography',
]
[tool.online-fxreader-pr34]
early_features = ['default', 'early',]
[build-system]
requires = ['setuptools']
build-backend = 'setuptools.build_meta'
[tool.setuptools]
include-package-data = false
[tool.setuptools.package-dir]
'online.fxreader.pr34' = 'online/fxreader/pr34'
#package_dir = '..'
#packages = ['online_fxreader']
#[tool.setuptools.packages.find]
#where = ['../..']
#include = ['../../online_fxreader/vpn']
#exclude =['../../aiortc/*', '../../_cffi_src/*']
#[tool.setuptools.packages.find]
#exclude = ['*']
#include = ['*.py']
# [tool.setuptools.exclude-package-data]
# 'online.fxreader.pr34' = ['online/fxreader/pr34/py.typed']
#[tool.setuptools.package-data]
#'online_fxreader.vpn' = ['requirements.txt']
[project.scripts]
online-fxreader-pr34-commands = 'online.fxreader.pr34.commands:commands_cli'

477
python/requirements.txt Normal file

@ -0,0 +1,477 @@
# This file was autogenerated by uv via the following command:
# uv pip compile --generate-hashes -o /home/nartes/Documents/current/freelance-project-34-marketing-blog/python/requirements.txt /tmp/requirementsax9awtnk.in
annotated-types==0.7.0 \
--hash=sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53 \
--hash=sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89
# via pydantic
build==1.2.2.post1 \
--hash=sha256:1d61c0887fa860c01971625baae8bdd338e517b836a2f70dd1f7aa3a6b2fc5b5 \
--hash=sha256:b36993e92ca9375a219c99e606a122ff365a760a2d4bba0caa09bd5278b608b7
# via -r /tmp/requirementsax9awtnk.in
cffi==1.17.1 \
--hash=sha256:045d61c734659cc045141be4bae381a41d89b741f795af1dd018bfb532fd0df8 \
--hash=sha256:0984a4925a435b1da406122d4d7968dd861c1385afe3b45ba82b750f229811e2 \
--hash=sha256:0e2b1fac190ae3ebfe37b979cc1ce69c81f4e4fe5746bb401dca63a9062cdaf1 \
--hash=sha256:0f048dcf80db46f0098ccac01132761580d28e28bc0f78ae0d58048063317e15 \
--hash=sha256:1257bdabf294dceb59f5e70c64a3e2f462c30c7ad68092d01bbbfb1c16b1ba36 \
--hash=sha256:1c39c6016c32bc48dd54561950ebd6836e1670f2ae46128f67cf49e789c52824 \
--hash=sha256:1d599671f396c4723d016dbddb72fe8e0397082b0a77a4fab8028923bec050e8 \
--hash=sha256:28b16024becceed8c6dfbc75629e27788d8a3f9030691a1dbf9821a128b22c36 \
--hash=sha256:2bb1a08b8008b281856e5971307cc386a8e9c5b625ac297e853d36da6efe9c17 \
--hash=sha256:30c5e0cb5ae493c04c8b42916e52ca38079f1b235c2f8ae5f4527b963c401caf \
--hash=sha256:31000ec67d4221a71bd3f67df918b1f88f676f1c3b535a7eb473255fdc0b83fc \
--hash=sha256:386c8bf53c502fff58903061338ce4f4950cbdcb23e2902d86c0f722b786bbe3 \
--hash=sha256:3edc8d958eb099c634dace3c7e16560ae474aa3803a5df240542b305d14e14ed \
--hash=sha256:45398b671ac6d70e67da8e4224a065cec6a93541bb7aebe1b198a61b58c7b702 \
--hash=sha256:46bf43160c1a35f7ec506d254e5c890f3c03648a4dbac12d624e4490a7046cd1 \
--hash=sha256:4ceb10419a9adf4460ea14cfd6bc43d08701f0835e979bf821052f1805850fe8 \
--hash=sha256:51392eae71afec0d0c8fb1a53b204dbb3bcabcb3c9b807eedf3e1e6ccf2de903 \
--hash=sha256:5da5719280082ac6bd9aa7becb3938dc9f9cbd57fac7d2871717b1feb0902ab6 \
--hash=sha256:610faea79c43e44c71e1ec53a554553fa22321b65fae24889706c0a84d4ad86d \
--hash=sha256:636062ea65bd0195bc012fea9321aca499c0504409f413dc88af450b57ffd03b \
--hash=sha256:6883e737d7d9e4899a8a695e00ec36bd4e5e4f18fabe0aca0efe0a4b44cdb13e \
--hash=sha256:6b8b4a92e1c65048ff98cfe1f735ef8f1ceb72e3d5f0c25fdb12087a23da22be \
--hash=sha256:6f17be4345073b0a7b8ea599688f692ac3ef23ce28e5df79c04de519dbc4912c \
--hash=sha256:706510fe141c86a69c8ddc029c7910003a17353970cff3b904ff0686a5927683 \
--hash=sha256:72e72408cad3d5419375fc87d289076ee319835bdfa2caad331e377589aebba9 \
--hash=sha256:733e99bc2df47476e3848417c5a4540522f234dfd4ef3ab7fafdf555b082ec0c \
--hash=sha256:7596d6620d3fa590f677e9ee430df2958d2d6d6de2feeae5b20e82c00b76fbf8 \
--hash=sha256:78122be759c3f8a014ce010908ae03364d00a1f81ab5c7f4a7a5120607ea56e1 \
--hash=sha256:805b4371bf7197c329fcb3ead37e710d1bca9da5d583f5073b799d5c5bd1eee4 \
--hash=sha256:85a950a4ac9c359340d5963966e3e0a94a676bd6245a4b55bc43949eee26a655 \
--hash=sha256:8f2cdc858323644ab277e9bb925ad72ae0e67f69e804f4898c070998d50b1a67 \
--hash=sha256:9755e4345d1ec879e3849e62222a18c7174d65a6a92d5b346b1863912168b595 \
--hash=sha256:98e3969bcff97cae1b2def8ba499ea3d6f31ddfdb7635374834cf89a1a08ecf0 \
--hash=sha256:a08d7e755f8ed21095a310a693525137cfe756ce62d066e53f502a83dc550f65 \
--hash=sha256:a1ed2dd2972641495a3ec98445e09766f077aee98a1c896dcb4ad0d303628e41 \
--hash=sha256:a24ed04c8ffd54b0729c07cee15a81d964e6fee0e3d4d342a27b020d22959dc6 \
--hash=sha256:a45e3c6913c5b87b3ff120dcdc03f6131fa0065027d0ed7ee6190736a74cd401 \
--hash=sha256:a9b15d491f3ad5d692e11f6b71f7857e7835eb677955c00cc0aefcd0669adaf6 \
--hash=sha256:ad9413ccdeda48c5afdae7e4fa2192157e991ff761e7ab8fdd8926f40b160cc3 \
--hash=sha256:b2ab587605f4ba0bf81dc0cb08a41bd1c0a5906bd59243d56bad7668a6fc6c16 \
--hash=sha256:b62ce867176a75d03a665bad002af8e6d54644fad99a3c70905c543130e39d93 \
--hash=sha256:c03e868a0b3bc35839ba98e74211ed2b05d2119be4e8a0f224fba9384f1fe02e \
--hash=sha256:c59d6e989d07460165cc5ad3c61f9fd8f1b4796eacbd81cee78957842b834af4 \
--hash=sha256:c7eac2ef9b63c79431bc4b25f1cd649d7f061a28808cbc6c47b534bd789ef964 \
--hash=sha256:c9c3d058ebabb74db66e431095118094d06abf53284d9c81f27300d0e0d8bc7c \
--hash=sha256:ca74b8dbe6e8e8263c0ffd60277de77dcee6c837a3d0881d8c1ead7268c9e576 \
--hash=sha256:caaf0640ef5f5517f49bc275eca1406b0ffa6aa184892812030f04c2abf589a0 \
--hash=sha256:cdf5ce3acdfd1661132f2a9c19cac174758dc2352bfe37d98aa7512c6b7178b3 \
--hash=sha256:d016c76bdd850f3c626af19b0542c9677ba156e4ee4fccfdd7848803533ef662 \
--hash=sha256:d01b12eeeb4427d3110de311e1774046ad344f5b1a7403101878976ecd7a10f3 \
--hash=sha256:d63afe322132c194cf832bfec0dc69a99fb9bb6bbd550f161a49e9e855cc78ff \
--hash=sha256:da95af8214998d77a98cc14e3a3bd00aa191526343078b530ceb0bd710fb48a5 \
--hash=sha256:dd398dbc6773384a17fe0d3e7eeb8d1a21c2200473ee6806bb5e6a8e62bb73dd \
--hash=sha256:de2ea4b5833625383e464549fec1bc395c1bdeeb5f25c4a3a82b5a8c756ec22f \
--hash=sha256:de55b766c7aa2e2a3092c51e0483d700341182f08e67c63630d5b6f200bb28e5 \
--hash=sha256:df8b1c11f177bc2313ec4b2d46baec87a5f3e71fc8b45dab2ee7cae86d9aba14 \
--hash=sha256:e03eab0a8677fa80d646b5ddece1cbeaf556c313dcfac435ba11f107ba117b5d \
--hash=sha256:e221cf152cff04059d011ee126477f0d9588303eb57e88923578ace7baad17f9 \
--hash=sha256:e31ae45bc2e29f6b2abd0de1cc3b9d5205aa847cafaecb8af1476a609a2f6eb7 \
--hash=sha256:edae79245293e15384b51f88b00613ba9f7198016a5948b5dddf4917d4d26382 \
--hash=sha256:f1e22e8c4419538cb197e4dd60acc919d7696e5ef98ee4da4e01d3f8cfa4cc5a \
--hash=sha256:f3a2b4222ce6b60e2e8b337bb9596923045681d71e5a082783484d845390938e \
--hash=sha256:f6a16c31041f09ead72d69f583767292f750d24913dadacf5756b966aacb3f1a \
--hash=sha256:f75c7ab1f9e4aca5414ed4d8e5c0e303a34f4421f8a0d47a4d019ceff0ab6af4 \
--hash=sha256:f79fc4fc25f1c8698ff97788206bb3c2598949bfe0fef03d299eb1b5356ada99 \
--hash=sha256:f7f5baafcc48261359e14bcd6d9bff6d4b28d9103847c9e136694cb0501aef87 \
--hash=sha256:fc48c783f9c87e60831201f2cce7f3b2e4846bf4d8728eabe54d60700b318a0b
# via cryptography
cryptography==44.0.2 \
--hash=sha256:04abd71114848aa25edb28e225ab5f268096f44cf0127f3d36975bdf1bdf3390 \
--hash=sha256:0529b1d5a0105dd3731fa65680b45ce49da4d8115ea76e9da77a875396727b41 \
--hash=sha256:1bc312dfb7a6e5d66082c87c34c8a62176e684b6fe3d90fcfe1568de675e6688 \
--hash=sha256:268e4e9b177c76d569e8a145a6939eca9a5fec658c932348598818acf31ae9a5 \
--hash=sha256:29ecec49f3ba3f3849362854b7253a9f59799e3763b0c9d0826259a88efa02f1 \
--hash=sha256:2bf7bf75f7df9715f810d1b038870309342bff3069c5bd8c6b96128cb158668d \
--hash=sha256:3b721b8b4d948b218c88cb8c45a01793483821e709afe5f622861fc6182b20a7 \
--hash=sha256:3c00b6b757b32ce0f62c574b78b939afab9eecaf597c4d624caca4f9e71e7843 \
--hash=sha256:3dc62975e31617badc19a906481deacdeb80b4bb454394b4098e3f2525a488c5 \
--hash=sha256:4973da6ca3db4405c54cd0b26d328be54c7747e89e284fcff166132eb7bccc9c \
--hash=sha256:4e389622b6927d8133f314949a9812972711a111d577a5d1f4bee5e58736b80a \
--hash=sha256:51e4de3af4ec3899d6d178a8c005226491c27c4ba84101bfb59c901e10ca9f79 \
--hash=sha256:5f6f90b72d8ccadb9c6e311c775c8305381db88374c65fa1a68250aa8a9cb3a6 \
--hash=sha256:6210c05941994290f3f7f175a4a57dbbb2afd9273657614c506d5976db061181 \
--hash=sha256:6f101b1f780f7fc613d040ca4bdf835c6ef3b00e9bd7125a4255ec574c7916e4 \
--hash=sha256:7bdcd82189759aba3816d1f729ce42ffded1ac304c151d0a8e89b9996ab863d5 \
--hash=sha256:7ca25849404be2f8e4b3c59483d9d3c51298a22c1c61a0e84415104dacaf5562 \
--hash=sha256:81276f0ea79a208d961c433a947029e1a15948966658cf6710bbabb60fcc2639 \
--hash=sha256:8cadc6e3b5a1f144a039ea08a0bdb03a2a92e19c46be3285123d32029f40a922 \
--hash=sha256:8e0ddd63e6bf1161800592c71ac794d3fb8001f2caebe0966e77c5234fa9efc3 \
--hash=sha256:909c97ab43a9c0c0b0ada7a1281430e4e5ec0458e6d9244c0e821bbf152f061d \
--hash=sha256:96e7a5e9d6e71f9f4fca8eebfd603f8e86c5225bb18eb621b2c1e50b290a9471 \
--hash=sha256:9a1e657c0f4ea2a23304ee3f964db058c9e9e635cc7019c4aa21c330755ef6fd \
--hash=sha256:9eb9d22b0a5d8fd9925a7764a054dca914000607dff201a24c791ff5c799e1fa \
--hash=sha256:af4ff3e388f2fa7bff9f7f2b31b87d5651c45731d3e8cfa0944be43dff5cfbdb \
--hash=sha256:b042d2a275c8cee83a4b7ae30c45a15e6a4baa65a179a0ec2d78ebb90e4f6699 \
--hash=sha256:bc821e161ae88bfe8088d11bb39caf2916562e0a2dc7b6d56714a48b784ef0bb \
--hash=sha256:c505d61b6176aaf982c5717ce04e87da5abc9a36a5b39ac03905c4aafe8de7aa \
--hash=sha256:c63454aa261a0cf0c5b4718349629793e9e634993538db841165b3df74f37ec0 \
--hash=sha256:c7362add18b416b69d58c910caa217f980c5ef39b23a38a0880dfd87bdf8cd23 \
--hash=sha256:d03806036b4f89e3b13b6218fefea8d5312e450935b1a2d55f0524e2ed7c59d9 \
--hash=sha256:d1b3031093a366ac767b3feb8bcddb596671b3aaff82d4050f984da0c248b615 \
--hash=sha256:d1c3572526997b36f245a96a2b1713bf79ce99b271bbcf084beb6b9b075f29ea \
--hash=sha256:efcfe97d1b3c79e486554efddeb8f6f53a4cdd4cf6086642784fa31fc384e1d7 \
--hash=sha256:f514ef4cd14bb6fb484b4a60203e912cfcb64f2ab139e88c2274511514bf7308
# via -r /tmp/requirementsax9awtnk.in
marisa-trie==1.2.1 \
--hash=sha256:06b099dd743676dbcd8abd8465ceac8f6d97d8bfaabe2c83b965495523b4cef2 \
--hash=sha256:0ee6cf6a16d9c3d1c94e21c8e63c93d8b34bede170ca4e937e16e1c0700d399f \
--hash=sha256:0fe69fb9ffb2767746181f7b3b29bbd3454d1d24717b5958e030494f3d3cddf3 \
--hash=sha256:1db3213b451bf058d558f6e619bceff09d1d130214448a207c55e1526e2773a1 \
--hash=sha256:20948e40ab2038e62b7000ca6b4a913bc16c91a2c2e6da501bd1f917eeb28d51 \
--hash=sha256:2428b495003c189695fb91ceeb499f9fcced3a2dce853e17fa475519433c67ff \
--hash=sha256:24a81aa7566e4ec96fc4d934581fe26d62eac47fc02b35fa443a0bb718b471e8 \
--hash=sha256:25688f34cac3bec01b4f655ffdd6c599a01f0bd596b4a79cf56c6f01a7df3560 \
--hash=sha256:36aa4401a1180615f74d575571a6550081d84fc6461e9aefc0bb7b2427af098e \
--hash=sha256:3a27c408e2aefc03e0f1d25b2ff2afb85aac3568f6fa2ae2a53b57a2e87ce29d \
--hash=sha256:3ad356442c2fea4c2a6f514738ddf213d23930f942299a2b2c05df464a00848a \
--hash=sha256:429858a0452a7bedcf67bc7bb34383d00f666c980cb75a31bcd31285fbdd4403 \
--hash=sha256:436f62d27714970b9cdd3b3c41bdad046f260e62ebb0daa38125ef70536fc73b \
--hash=sha256:46e528ee71808c961baf8c3ce1c46a8337ec7a96cc55389d11baafe5b632f8e9 \
--hash=sha256:4728ed3ae372d1ea2cdbd5eaa27b8f20a10e415d1f9d153314831e67d963f281 \
--hash=sha256:536ea19ce6a2ce61c57fed4123ecd10d18d77a0db45cd2741afff2b8b68f15b3 \
--hash=sha256:5685a14b3099b1422c4f59fa38b0bf4b5342ee6cc38ae57df9666a0b28eeaad3 \
--hash=sha256:594f98491a96c7f1ffe13ce292cef1b4e63c028f0707effdea0f113364c1ae6c \
--hash=sha256:5bd39a4e1cc839a88acca2889d17ebc3f202a5039cd6059a13148ce75c8a6244 \
--hash=sha256:5e43891a37b0d7f618819fea14bd951289a0a8e3dd0da50c596139ca83ebb9b1 \
--hash=sha256:5e649f3dc8ab5476732094f2828cc90cac3be7c79bc0c8318b6fda0c1d248db4 \
--hash=sha256:5fe5a286f997848a410eebe1c28657506adaeb405220ee1e16cfcfd10deb37f2 \
--hash=sha256:638506eacf20ca503fff72221a7e66a6eadbf28d6a4a6f949fcf5b1701bb05ec \
--hash=sha256:6532615111eec2c79e711965ece0bc95adac1ff547a7fff5ffca525463116deb \
--hash=sha256:66b23e5b35dd547f85bf98db7c749bc0ffc57916ade2534a6bbc32db9a4abc44 \
--hash=sha256:6704adf0247d2dda42e876b793be40775dff46624309ad99bc7537098bee106d \
--hash=sha256:67f0c2ec82c20a02c16fc9ba81dee2586ef20270127c470cb1054767aa8ba310 \
--hash=sha256:6946100a43f933fad6bc458c502a59926d80b321d5ac1ed2ff9c56605360496f \
--hash=sha256:6c50c861faad0a5c091bd763e0729f958c316e678dfa065d3984fbb9e4eacbcd \
--hash=sha256:735c363d9aaac82eaf516a28f7c6b95084c2e176d8231c87328dc80e112a9afa \
--hash=sha256:746a7c60a17fccd3cfcfd4326926f02ea4fcdfc25d513411a0c4fc8e4a1ca51f \
--hash=sha256:7ac170d20b97beb75059ba65d1ccad6b434d777c8992ab41ffabdade3b06dd74 \
--hash=sha256:7cca7f96236ffdbf49be4b2e42c132e3df05968ac424544034767650913524de \
--hash=sha256:7e7b1786e852e014d03e5f32dbd991f9a9eb223dd3fa9a2564108b807e4b7e1c \
--hash=sha256:852d7bcf14b0c63404de26e7c4c8d5d65ecaeca935e93794331bc4e2f213660b \
--hash=sha256:875a6248e60fbb48d947b574ffa4170f34981f9e579bde960d0f9a49ea393ecc \
--hash=sha256:8951e7ce5d3167fbd085703b4cbb3f47948ed66826bef9a2173c379508776cf5 \
--hash=sha256:8cf4f25cf895692b232f49aa5397af6aba78bb679fb917a05fce8d3cb1ee446d \
--hash=sha256:952af3a5859c3b20b15a00748c36e9eb8316eb2c70bd353ae1646da216322908 \
--hash=sha256:98042040d1d6085792e8d0f74004fc0f5f9ca6091c298f593dd81a22a4643854 \
--hash=sha256:9c9b32b14651a6dcf9e8857d2df5d29d322a1ea8c0be5c8ffb88f9841c4ec62b \
--hash=sha256:9e956e6a46f604b17d570901e66f5214fb6f658c21e5e7665deace236793cef6 \
--hash=sha256:9f627f4e41be710b6cb6ed54b0128b229ac9d50e2054d9cde3af0fef277c23cf \
--hash=sha256:a2eb41d2f9114d8b7bd66772c237111e00d2bae2260824560eaa0a1e291ce9e8 \
--hash=sha256:a3c98613180cf1730e221933ff74b454008161b1a82597e41054127719964188 \
--hash=sha256:a4177dc0bd1374e82be9b2ba4d0c2733b0a85b9d154ceeea83a5bee8c1e62fbf \
--hash=sha256:a8443d116c612cfd1961fbf76769faf0561a46d8e317315dd13f9d9639ad500c \
--hash=sha256:aa7cd17e1c690ce96c538b2f4aae003d9a498e65067dd433c52dd069009951d4 \
--hash=sha256:ad548117744b2bcf0e3d97374608be0a92d18c2af13d98b728d37cd06248e571 \
--hash=sha256:aefe0973cc4698e0907289dc0517ab0c7cdb13d588201932ff567d08a50b0e2e \
--hash=sha256:b0ef26733d3c836be79e812071e1a431ce1f807955a27a981ebb7993d95f842b \
--hash=sha256:b1ce340da608530500ab4f963f12d6bfc8d8680900919a60dbdc9b78c02060a4 \
--hash=sha256:b1ec93f0d1ee6d7ab680a6d8ea1a08bf264636358e92692072170032dda652ba \
--hash=sha256:b2a7d00f53f4945320b551bccb826b3fb26948bde1a10d50bb9802fabb611b10 \
--hash=sha256:b2eacb84446543082ec50f2fb563f1a94c96804d4057b7da8ed815958d0cdfbe \
--hash=sha256:b5ea16e69bfda0ac028c921b58de1a4aaf83d43934892977368579cd3c0a2554 \
--hash=sha256:bd45142501300e7538b2e544905580918b67b1c82abed1275fe4c682c95635fa \
--hash=sha256:c0fe2ace0cb1806badbd1c551a8ec2f8d4cf97bf044313c082ef1acfe631ddca \
--hash=sha256:c484410911182457a8a1a0249d0c09c01e2071b78a0a8538cd5f7fa45589b13a \
--hash=sha256:ce37d8ca462bb64cc13f529b9ed92f7b21fe8d1f1679b62e29f9cb7d0e888b49 \
--hash=sha256:ce59bcd2cda9bb52b0e90cc7f36413cd86c3d0ce7224143447424aafb9f4aa48 \
--hash=sha256:d2a82eb21afdaf22b50d9b996472305c05ca67fc4ff5a026a220320c9c961db6 \
--hash=sha256:d5648c6dcc5dc9200297fb779b1663b8a4467bda034a3c69bd9c32d8afb33b1d \
--hash=sha256:d659fda873d8dcb2c14c2c331de1dee21f5a902d7f2de7978b62c6431a8850ef \
--hash=sha256:d7eb20bf0e8b55a58d2a9b518aabc4c18278787bdba476c551dd1c1ed109e509 \
--hash=sha256:da4e4facb79614cc4653cfd859f398e4db4ca9ab26270ff12610e50ed7f1f6c6 \
--hash=sha256:de1665eaafefa48a308e4753786519888021740501a15461c77bdfd57638e6b4 \
--hash=sha256:e2699255d7ac610dee26d4ae7bda5951d05c7d9123a22e1f7c6a6f1964e0a4e4 \
--hash=sha256:e58788004adda24c401d1751331618ed20c507ffc23bfd28d7c0661a1cf0ad16 \
--hash=sha256:e70869737cc0e5bd903f620667da6c330d6737048d1f44db792a6af68a1d35be \
--hash=sha256:eba6ca45500ca1a042466a0684aacc9838e7f20fe2605521ee19f2853062798f \
--hash=sha256:ed3fb4ed7f2084597e862bcd56c56c5529e773729a426c083238682dba540e98 \
--hash=sha256:f2806f75817392cedcacb24ac5d80b0350dde8d3861d67d045c1d9b109764114 \
--hash=sha256:f35c2603a6be168088ed1db6ad1704b078aa8f39974c60888fbbced95dcadad4 \
--hash=sha256:f4cd800704a5fc57e53c39c3a6b0c9b1519ebdbcb644ede3ee67a06eb542697d \
--hash=sha256:f713af9b8aa66a34cd3a78c7d150a560a75734713abe818a69021fd269e927fa
# via -r /tmp/requirementsax9awtnk.in
meson==1.7.0 \
--hash=sha256:08efbe84803eed07f863b05092d653a9d348f7038761d900412fddf56deb0284 \
--hash=sha256:ae3f12953045f3c7c60e27f2af1ad862f14dee125b4ed9bcb8a842a5080dbf85
# via meson-python
meson-python==0.17.1 \
--hash=sha256:30a75c52578ef14aff8392677b09c39346e0a24d2b2c6204b8ed30583c11269c \
--hash=sha256:efb91f69f2e19eef7bc9a471ed2a4e730088cc6b39eacaf3e49fc4f930eb5f83
# via -r /tmp/requirementsax9awtnk.in
mypy==1.15.0 \
--hash=sha256:1124a18bc11a6a62887e3e137f37f53fbae476dc36c185d549d4f837a2a6a14e \
--hash=sha256:171a9ca9a40cd1843abeca0e405bc1940cd9b305eaeea2dda769ba096932bb22 \
--hash=sha256:1905f494bfd7d85a23a88c5d97840888a7bd516545fc5aaedff0267e0bb54e2f \
--hash=sha256:1fbb8da62dc352133d7d7ca90ed2fb0e9d42bb1a32724c287d3c76c58cbaa9c2 \
--hash=sha256:2922d42e16d6de288022e5ca321cd0618b238cfc5570e0263e5ba0a77dbef56f \
--hash=sha256:2e2c2e6d3593f6451b18588848e66260ff62ccca522dd231cd4dd59b0160668b \
--hash=sha256:2ee2d57e01a7c35de00f4634ba1bbf015185b219e4dc5909e281016df43f5ee5 \
--hash=sha256:2f2147ab812b75e5b5499b01ade1f4a81489a147c01585cda36019102538615f \
--hash=sha256:404534629d51d3efea5c800ee7c42b72a6554d6c400e6a79eafe15d11341fd43 \
--hash=sha256:5469affef548bd1895d86d3bf10ce2b44e33d86923c29e4d675b3e323437ea3e \
--hash=sha256:5a95fb17c13e29d2d5195869262f8125dfdb5c134dc8d9a9d0aecf7525b10c2c \
--hash=sha256:6983aae8b2f653e098edb77f893f7b6aca69f6cffb19b2cc7443f23cce5f4828 \
--hash=sha256:712e962a6357634fef20412699a3655c610110e01cdaa6180acec7fc9f8513ba \
--hash=sha256:8023ff13985661b50a5928fc7a5ca15f3d1affb41e5f0a9952cb68ef090b31ee \
--hash=sha256:811aeccadfb730024c5d3e326b2fbe9249bb7413553f15499a4050f7c30e801d \
--hash=sha256:8f8722560a14cde92fdb1e31597760dc35f9f5524cce17836c0d22841830fd5b \
--hash=sha256:93faf3fdb04768d44bf28693293f3904bbb555d076b781ad2530214ee53e3445 \
--hash=sha256:973500e0774b85d9689715feeffcc980193086551110fd678ebe1f4342fb7c5e \
--hash=sha256:979e4e1a006511dacf628e36fadfecbcc0160a8af6ca7dad2f5025529e082c13 \
--hash=sha256:98b7b9b9aedb65fe628c62a6dc57f6d5088ef2dfca37903a7d9ee374d03acca5 \
--hash=sha256:aea39e0583d05124836ea645f412e88a5c7d0fd77a6d694b60d9b6b2d9f184fd \
--hash=sha256:b9378e2c00146c44793c98b8d5a61039a048e31f429fb0eb546d93f4b000bedf \
--hash=sha256:baefc32840a9f00babd83251560e0ae1573e2f9d1b067719479bfb0e987c6357 \
--hash=sha256:be68172e9fd9ad8fb876c6389f16d1c1b5f100ffa779f77b1fb2176fcc9ab95b \
--hash=sha256:c43a7682e24b4f576d93072216bf56eeff70d9140241f9edec0c104d0c515036 \
--hash=sha256:c4bb0e1bd29f7d34efcccd71cf733580191e9a264a2202b0239da95984c5b559 \
--hash=sha256:c7be1e46525adfa0d97681432ee9fcd61a3964c2446795714699a998d193f1a3 \
--hash=sha256:c9817fa23833ff189db061e6d2eff49b2f3b6ed9856b4a0a73046e41932d744f \
--hash=sha256:ce436f4c6d218a070048ed6a44c0bbb10cd2cc5e272b29e7845f6a2f57ee4464 \
--hash=sha256:d10d994b41fb3497719bbf866f227b3489048ea4bbbb5015357db306249f7980 \
--hash=sha256:e601a7fa172c2131bff456bb3ee08a88360760d0d2f8cbd7a75a65497e2df078 \
--hash=sha256:f95579473af29ab73a10bada2f9722856792a36ec5af5399b653aa28360290a5
# via -r /tmp/requirementsax9awtnk.in
mypy-extensions==1.0.0 \
--hash=sha256:4392f6c0eb8a5668a69e23d168ffa70f0be9ccfd32b5cc2d26a34ae5b844552d \
--hash=sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782
# via mypy
numpy==2.2.4 \
--hash=sha256:05c076d531e9998e7e694c36e8b349969c56eadd2cdcd07242958489d79a7286 \
--hash=sha256:0d54974f9cf14acf49c60f0f7f4084b6579d24d439453d5fc5805d46a165b542 \
--hash=sha256:11c43995255eb4127115956495f43e9343736edb7fcdb0d973defd9de14cd84f \
--hash=sha256:188dcbca89834cc2e14eb2f106c96d6d46f200fe0200310fc29089657379c58d \
--hash=sha256:1974afec0b479e50438fc3648974268f972e2d908ddb6d7fb634598cdb8260a0 \
--hash=sha256:1cf4e5c6a278d620dee9ddeb487dc6a860f9b199eadeecc567f777daace1e9e7 \
--hash=sha256:207a2b8441cc8b6a2a78c9ddc64d00d20c303d79fba08c577752f080c4007ee3 \
--hash=sha256:218f061d2faa73621fa23d6359442b0fc658d5b9a70801373625d958259eaca3 \
--hash=sha256:2aad3c17ed2ff455b8eaafe06bcdae0062a1db77cb99f4b9cbb5f4ecb13c5146 \
--hash=sha256:2fa8fa7697ad1646b5c93de1719965844e004fcad23c91228aca1cf0800044a1 \
--hash=sha256:31504f970f563d99f71a3512d0c01a645b692b12a63630d6aafa0939e52361e6 \
--hash=sha256:3387dd7232804b341165cedcb90694565a6015433ee076c6754775e85d86f1fc \
--hash=sha256:4ba5054787e89c59c593a4169830ab362ac2bee8a969249dc56e5d7d20ff8df9 \
--hash=sha256:4f92084defa704deadd4e0a5ab1dc52d8ac9e8a8ef617f3fbb853e79b0ea3592 \
--hash=sha256:65ef3468b53269eb5fdb3a5c09508c032b793da03251d5f8722b1194f1790c00 \
--hash=sha256:6f527d8fdb0286fd2fd97a2a96c6be17ba4232da346931d967a0630050dfd298 \
--hash=sha256:7051ee569db5fbac144335e0f3b9c2337e0c8d5c9fee015f259a5bd70772b7e8 \
--hash=sha256:7716e4a9b7af82c06a2543c53ca476fa0b57e4d760481273e09da04b74ee6ee2 \
--hash=sha256:79bd5f0a02aa16808fcbc79a9a376a147cc1045f7dfe44c6e7d53fa8b8a79392 \
--hash=sha256:7a4e84a6283b36632e2a5b56e121961f6542ab886bc9e12f8f9818b3c266bfbb \
--hash=sha256:8120575cb4882318c791f839a4fd66161a6fa46f3f0a5e613071aae35b5dd8f8 \
--hash=sha256:81413336ef121a6ba746892fad881a83351ee3e1e4011f52e97fba79233611fd \
--hash=sha256:8146f3550d627252269ac42ae660281d673eb6f8b32f113538e0cc2a9aed42b9 \
--hash=sha256:879cf3a9a2b53a4672a168c21375166171bc3932b7e21f622201811c43cdd3b0 \
--hash=sha256:892c10d6a73e0f14935c31229e03325a7b3093fafd6ce0af704be7f894d95687 \
--hash=sha256:92bda934a791c01d6d9d8e038363c50918ef7c40601552a58ac84c9613a665bc \
--hash=sha256:9ba03692a45d3eef66559efe1d1096c4b9b75c0986b5dff5530c378fb8331d4f \
--hash=sha256:9eeea959168ea555e556b8188da5fa7831e21d91ce031e95ce23747b7609f8a4 \
--hash=sha256:a0258ad1f44f138b791327961caedffbf9612bfa504ab9597157806faa95194a \
--hash=sha256:a761ba0fa886a7bb33c6c8f6f20213735cb19642c580a931c625ee377ee8bd39 \
--hash=sha256:a7b9084668aa0f64e64bd00d27ba5146ef1c3a8835f3bd912e7a9e01326804c4 \
--hash=sha256:a84eda42bd12edc36eb5b53bbcc9b406820d3353f1994b6cfe453a33ff101775 \
--hash=sha256:ab2939cd5bec30a7430cbdb2287b63151b77cf9624de0532d629c9a1c59b1d5c \
--hash=sha256:ac0280f1ba4a4bfff363a99a6aceed4f8e123f8a9b234c89140f5e894e452ecd \
--hash=sha256:adf8c1d66f432ce577d0197dceaac2ac00c0759f573f28516246351c58a85020 \
--hash=sha256:b4adfbbc64014976d2f91084915ca4e626fbf2057fb81af209c1a6d776d23e3d \
--hash=sha256:bb649f8b207ab07caebba230d851b579a3c8711a851d29efe15008e31bb4de24 \
--hash=sha256:bce43e386c16898b91e162e5baaad90c4b06f9dcbe36282490032cec98dc8ae7 \
--hash=sha256:bd3ad3b0a40e713fc68f99ecfd07124195333f1e689387c180813f0e94309d6f \
--hash=sha256:c3f7ac96b16955634e223b579a3e5798df59007ca43e8d451a0e6a50f6bfdfba \
--hash=sha256:cf28633d64294969c019c6df4ff37f5698e8326db68cc2b66576a51fad634880 \
--hash=sha256:d0f35b19894a9e08639fd60a1ec1978cb7f5f7f1eace62f38dd36be8aecdef4d \
--hash=sha256:db1f1c22173ac1c58db249ae48aa7ead29f534b9a948bc56828337aa84a32ed6 \
--hash=sha256:dbe512c511956b893d2dacd007d955a3f03d555ae05cfa3ff1c1ff6df8851854 \
--hash=sha256:df2f57871a96bbc1b69733cd4c51dc33bea66146b8c63cacbfed73eec0883017 \
--hash=sha256:e2f085ce2e813a50dfd0e01fbfc0c12bbe5d2063d99f8b29da30e544fb6483b8 \
--hash=sha256:e642d86b8f956098b564a45e6f6ce68a22c2c97a04f5acd3f221f57b8cb850ae \
--hash=sha256:e9e0a277bb2eb5d8a7407e14688b85fd8ad628ee4e0c7930415687b6564207a4 \
--hash=sha256:ea2bb7e2ae9e37d96835b3576a4fa4b3a97592fbea8ef7c3587078b0068b8f09 \
--hash=sha256:ee4d528022f4c5ff67332469e10efe06a267e32f4067dc76bb7e2cddf3cd25ff \
--hash=sha256:f05d4198c1bacc9124018109c5fba2f3201dbe7ab6e92ff100494f236209c960 \
--hash=sha256:f34dc300df798742b3d06515aa2a0aee20941c13579d7a2f2e10af01ae4901ee \
--hash=sha256:f4162988a360a29af158aeb4a2f4f09ffed6a969c9776f8f3bdee9b06a8ab7e5 \
--hash=sha256:f486038e44caa08dbd97275a9a35a283a8f1d2f0ee60ac260a1790e76660833c \
--hash=sha256:f7de08cbe5551911886d1ab60de58448c6df0f67d9feb7d1fb21e9875ef95e91
# via -r /tmp/requirementsax9awtnk.in
packaging==24.2 \
--hash=sha256:09abb1bccd265c01f4a3aa3f7a7db064b36514d2cba19a2f694fe6150451a759 \
--hash=sha256:c228a6dc5e932d346bc5739379109d49e8853dd8223571c7c5b55260edc0b97f
# via
# build
# meson-python
# pyproject-metadata
pip==25.0.1 \
--hash=sha256:88f96547ea48b940a3a385494e181e29fb8637898f88d88737c5049780f196ea \
--hash=sha256:c46efd13b6aa8279f33f2864459c8ce587ea6a1a59ee20de055868d8f7688f7f
# via -r /tmp/requirementsax9awtnk.in
pybind11==2.13.6 \
--hash=sha256:237c41e29157b962835d356b370ededd57594a26d5894a795960f0047cb5caf5 \
--hash=sha256:ba6af10348c12b24e92fa086b39cfba0eff619b61ac77c406167d813b096d39a
# via -r /tmp/requirementsax9awtnk.in
pycparser==2.22 \
--hash=sha256:491c8be9c040f5390f5bf44a5b07752bd07f56edf992381b05c701439eec10f6 \
--hash=sha256:c3702b6d3dd8c7abc1afa565d7e63d53a1d0bd86cdc24edd75470f4de499cfcc
# via cffi
pydantic==2.11.1 \
--hash=sha256:442557d2910e75c991c39f4b4ab18963d57b9b55122c8b2a9cd176d8c29ce968 \
--hash=sha256:5b6c415eee9f8123a14d859be0c84363fec6b1feb6b688d6435801230b56e0b8
# via
# -r /tmp/requirementsax9awtnk.in
# pydantic-settings
pydantic-core==2.33.0 \
--hash=sha256:024d136ae44d233e6322027bbf356712b3940bee816e6c948ce4b90f18471b3d \
--hash=sha256:0310524c833d91403c960b8a3cf9f46c282eadd6afd276c8c5edc617bd705dc9 \
--hash=sha256:07b4ced28fccae3f00626eaa0c4001aa9ec140a29501770a88dbbb0966019a86 \
--hash=sha256:085d8985b1c1e48ef271e98a658f562f29d89bda98bf120502283efbc87313eb \
--hash=sha256:0a98257451164666afafc7cbf5fb00d613e33f7e7ebb322fbcd99345695a9a61 \
--hash=sha256:0bcf0bab28995d483f6c8d7db25e0d05c3efa5cebfd7f56474359e7137f39856 \
--hash=sha256:138d31e3f90087f42aa6286fb640f3c7a8eb7bdae829418265e7e7474bd2574b \
--hash=sha256:14229c1504287533dbf6b1fc56f752ce2b4e9694022ae7509631ce346158de11 \
--hash=sha256:1583539533160186ac546b49f5cde9ffc928062c96920f58bd95de32ffd7bffd \
--hash=sha256:175ab598fb457a9aee63206a1993874badf3ed9a456e0654273e56f00747bbd6 \
--hash=sha256:1a69b7596c6603afd049ce7f3835bcf57dd3892fc7279f0ddf987bebed8caa5a \
--hash=sha256:1a73be93ecef45786d7d95b0c5e9b294faf35629d03d5b145b09b81258c7cd6d \
--hash=sha256:1b1262b912435a501fa04cd213720609e2cefa723a07c92017d18693e69bf00b \
--hash=sha256:1b2ea72dea0825949a045fa4071f6d5b3d7620d2a208335207793cf29c5a182d \
--hash=sha256:20d4275f3c4659d92048c70797e5fdc396c6e4446caf517ba5cad2db60cd39d3 \
--hash=sha256:23c3e77bf8a7317612e5c26a3b084c7edeb9552d645742a54a5867635b4f2453 \
--hash=sha256:26a4ea04195638dcd8c53dadb545d70badba51735b1594810e9768c2c0b4a5da \
--hash=sha256:26bc7367c0961dec292244ef2549afa396e72e28cc24706210bd44d947582c59 \
--hash=sha256:2a0147c0bef783fd9abc9f016d66edb6cac466dc54a17ec5f5ada08ff65caf5d \
--hash=sha256:2c0afd34f928383e3fd25740f2050dbac9d077e7ba5adbaa2227f4d4f3c8da5c \
--hash=sha256:30369e54d6d0113d2aa5aee7a90d17f225c13d87902ace8fcd7bbf99b19124db \
--hash=sha256:31860fbda80d8f6828e84b4a4d129fd9c4535996b8249cfb8c720dc2a1a00bb8 \
--hash=sha256:34e7fb3abe375b5c4e64fab75733d605dda0f59827752debc99c17cb2d5f3276 \
--hash=sha256:40eb8af662ba409c3cbf4a8150ad32ae73514cd7cb1f1a2113af39763dd616b3 \
--hash=sha256:41d698dcbe12b60661f0632b543dbb119e6ba088103b364ff65e951610cb7ce0 \
--hash=sha256:4726f1f3f42d6a25678c67da3f0b10f148f5655813c5aca54b0d1742ba821b8f \
--hash=sha256:4927564be53239a87770a5f86bdc272b8d1fbb87ab7783ad70255b4ab01aa25b \
--hash=sha256:4b6d77c75a57f041c5ee915ff0b0bb58eabb78728b69ed967bc5b780e8f701b8 \
--hash=sha256:4d9149e7528af8bbd76cc055967e6e04617dcb2a2afdaa3dea899406c5521faa \
--hash=sha256:4deac83a8cc1d09e40683be0bc6d1fa4cde8df0a9bf0cda5693f9b0569ac01b6 \
--hash=sha256:4f1ab031feb8676f6bd7c85abec86e2935850bf19b84432c64e3e239bffeb1ec \
--hash=sha256:502ed542e0d958bd12e7c3e9a015bce57deaf50eaa8c2e1c439b512cb9db1e3a \
--hash=sha256:5461934e895968655225dfa8b3be79e7e927e95d4bd6c2d40edd2fa7052e71b6 \
--hash=sha256:58c1151827eef98b83d49b6ca6065575876a02d2211f259fb1a6b7757bd24dd8 \
--hash=sha256:5bdd36b362f419c78d09630cbaebc64913f66f62bda6d42d5fbb08da8cc4f181 \
--hash=sha256:5bf637300ff35d4f59c006fff201c510b2b5e745b07125458a5389af3c0dff8c \
--hash=sha256:5bf68bb859799e9cec3d9dd8323c40c00a254aabb56fe08f907e437005932f2b \
--hash=sha256:5d8dc9f63a26f7259b57f46a7aab5af86b2ad6fbe48487500bb1f4b27e051e4c \
--hash=sha256:5f36afd0d56a6c42cf4e8465b6441cf546ed69d3a4ec92724cc9c8c61bd6ecf4 \
--hash=sha256:5f72914cfd1d0176e58ddc05c7a47674ef4222c8253bf70322923e73e14a4ac3 \
--hash=sha256:6291797cad239285275558e0a27872da735b05c75d5237bbade8736f80e4c225 \
--hash=sha256:62c151ce3d59ed56ebd7ce9ce5986a409a85db697d25fc232f8e81f195aa39a1 \
--hash=sha256:635702b2fed997e0ac256b2cfbdb4dd0bf7c56b5d8fba8ef03489c03b3eb40e2 \
--hash=sha256:64672fa888595a959cfeff957a654e947e65bbe1d7d82f550417cbd6898a1d6b \
--hash=sha256:68504959253303d3ae9406b634997a2123a0b0c1da86459abbd0ffc921695eac \
--hash=sha256:69297418ad644d521ea3e1aa2e14a2a422726167e9ad22b89e8f1130d68e1e9a \
--hash=sha256:6c32a40712e3662bebe524abe8abb757f2fa2000028d64cc5a1006016c06af43 \
--hash=sha256:715c62af74c236bf386825c0fdfa08d092ab0f191eb5b4580d11c3189af9d330 \
--hash=sha256:71dffba8fe9ddff628c68f3abd845e91b028361d43c5f8e7b3f8b91d7d85413e \
--hash=sha256:7419241e17c7fbe5074ba79143d5523270e04f86f1b3a0dff8df490f84c8273a \
--hash=sha256:759871f00e26ad3709efc773ac37b4d571de065f9dfb1778012908bcc36b3a73 \
--hash=sha256:7a25493320203005d2a4dac76d1b7d953cb49bce6d459d9ae38e30dd9f29bc9c \
--hash=sha256:7b79af799630af263eca9ec87db519426d8c9b3be35016eddad1832bac812d87 \
--hash=sha256:7c9c84749f5787781c1c45bb99f433402e484e515b40675a5d121ea14711cf61 \
--hash=sha256:7da333f21cd9df51d5731513a6d39319892947604924ddf2e24a4612975fb936 \
--hash=sha256:82a4eba92b7ca8af1b7d5ef5f3d9647eee94d1f74d21ca7c21e3a2b92e008358 \
--hash=sha256:89670d7a0045acb52be0566df5bc8b114ac967c662c06cf5e0c606e4aadc964b \
--hash=sha256:8a1d581e8cdbb857b0e0e81df98603376c1a5c34dc5e54039dcc00f043df81e7 \
--hash=sha256:8ec86b5baa36f0a0bfb37db86c7d52652f8e8aa076ab745ef7725784183c3fdd \
--hash=sha256:91301a0980a1d4530d4ba7e6a739ca1a6b31341252cb709948e0aca0860ce0ae \
--hash=sha256:918f2013d7eadea1d88d1a35fd4a1e16aaf90343eb446f91cb091ce7f9b431a2 \
--hash=sha256:9cb2390355ba084c1ad49485d18449b4242da344dea3e0fe10babd1f0db7dcfc \
--hash=sha256:9ee65f0cc652261744fd07f2c6e6901c914aa6c5ff4dcfaf1136bc394d0dd26b \
--hash=sha256:a608a75846804271cf9c83e40bbb4dab2ac614d33c6fd5b0c6187f53f5c593ef \
--hash=sha256:a66d931ea2c1464b738ace44b7334ab32a2fd50be023d863935eb00f42be1778 \
--hash=sha256:a7a7f2a3f628d2f7ef11cb6188bcf0b9e1558151d511b974dfea10a49afe192b \
--hash=sha256:abaeec1be6ed535a5d7ffc2e6c390083c425832b20efd621562fbb5bff6dc518 \
--hash=sha256:abfa44cf2f7f7d7a199be6c6ec141c9024063205545aa09304349781b9a125e6 \
--hash=sha256:ade5dbcf8d9ef8f4b28e682d0b29f3008df9842bb5ac48ac2c17bc55771cc976 \
--hash=sha256:ae62032ef513fe6281ef0009e30838a01057b832dc265da32c10469622613885 \
--hash=sha256:aec79acc183865bad120b0190afac467c20b15289050648b876b07777e67ea48 \
--hash=sha256:b716294e721d8060908dbebe32639b01bfe61b15f9f57bcc18ca9a0e00d9520b \
--hash=sha256:b9ec80eb5a5f45a2211793f1c4aeddff0c3761d1c70d684965c1807e923a588b \
--hash=sha256:ba95691cf25f63df53c1d342413b41bd7762d9acb425df8858d7efa616c0870e \
--hash=sha256:bccc06fa0372151f37f6b69834181aa9eb57cf8665ed36405fb45fbf6cac3bae \
--hash=sha256:c860773a0f205926172c6644c394e02c25421dc9a456deff16f64c0e299487d3 \
--hash=sha256:ca1103d70306489e3d006b0f79db8ca5dd3c977f6f13b2c59ff745249431a606 \
--hash=sha256:ce72d46eb201ca43994303025bd54d8a35a3fc2a3495fac653d6eb7205ce04f4 \
--hash=sha256:d20cbb9d3e95114325780f3cfe990f3ecae24de7a2d75f978783878cce2ad585 \
--hash=sha256:dcfebee69cd5e1c0b76a17e17e347c84b00acebb8dd8edb22d4a03e88e82a207 \
--hash=sha256:e1c69aa459f5609dec2fa0652d495353accf3eda5bdb18782bc5a2ae45c9273a \
--hash=sha256:e2762c568596332fdab56b07060c8ab8362c56cf2a339ee54e491cd503612c50 \
--hash=sha256:e37f10f6d4bc67c58fbd727108ae1d8b92b397355e68519f1e4a7babb1473442 \
--hash=sha256:e790954b5093dff1e3a9a2523fddc4e79722d6f07993b4cd5547825c3cbf97b5 \
--hash=sha256:e81a295adccf73477220e15ff79235ca9dcbcee4be459eb9d4ce9a2763b8386c \
--hash=sha256:e925819a98318d17251776bd3d6aa9f3ff77b965762155bdad15d1a9265c4cfd \
--hash=sha256:ea30239c148b6ef41364c6f51d103c2988965b643d62e10b233b5efdca8c0099 \
--hash=sha256:eabf946a4739b5237f4f56d77fa6668263bc466d06a8036c055587c130a46f7b \
--hash=sha256:ecb158fb9b9091b515213bed3061eb7deb1d3b4e02327c27a0ea714ff46b0760 \
--hash=sha256:ecc6d02d69b54a2eb83ebcc6f29df04957f734bcf309d346b4f83354d8376862 \
--hash=sha256:eddb18a00bbb855325db27b4c2a89a4ba491cd6a0bd6d852b225172a1f54b36c \
--hash=sha256:f00e8b59e1fc8f09d05594aa7d2b726f1b277ca6155fc84c0396db1b373c4555 \
--hash=sha256:f1fb026c575e16f673c61c7b86144517705865173f3d0907040ac30c4f9f5915 \
--hash=sha256:f200b2f20856b5a6c3a35f0d4e344019f805e363416e609e9b47c552d35fd5ea \
--hash=sha256:f225f3a3995dbbc26affc191d0443c6c4aa71b83358fd4c2b7d63e2f6f0336f9 \
--hash=sha256:f22dab23cdbce2005f26a8f0c71698457861f97fc6318c75814a50c75e87d025 \
--hash=sha256:f3eb479354c62067afa62f53bb387827bee2f75c9c79ef25eef6ab84d4b1ae3b \
--hash=sha256:fc53e05c16697ff0c1c7c2b98e45e131d4bfb78068fffff92a82d169cbb4c7b7 \
--hash=sha256:ff48a55be9da6930254565ff5238d71d5e9cd8c5487a191cb85df3bdb8c77365
# via pydantic
pydantic-settings==2.8.1 \
--hash=sha256:81942d5ac3d905f7f3ee1a70df5dfb62d5569c12f51a5a647defc1c3d9ee2e9c \
--hash=sha256:d5c663dfbe9db9d5e1c646b2e161da12f0d734d422ee56f567d0ea2cee4e8585
# via -r /tmp/requirementsax9awtnk.in
pyproject-hooks==1.2.0 \
--hash=sha256:1e859bd5c40fae9448642dd871adf459e5e2084186e8d2c2a79a824c970da1f8 \
--hash=sha256:9e5c6bfa8dcc30091c74b0cf803c81fdd29d94f01992a7707bc97babb1141913
# via build
pyproject-metadata==0.9.1 \
--hash=sha256:b8b2253dd1b7062b78cf949a115f02ba7fa4114aabe63fa10528e9e1a954a816 \
--hash=sha256:ee5efde548c3ed9b75a354fc319d5afd25e9585fa918a34f62f904cc731973ad
# via meson-python
python-dotenv==1.1.0 \
--hash=sha256:41f90bc6f5f177fb41f53e87666db362025010eb28f60a01c9143bfa33a2b2d5 \
--hash=sha256:d7c01d9e2293916c18baf562d95698754b0dbbb5e74d457c45d4f6561fb9d55d
# via pydantic-settings
setuptools==78.1.0 \
--hash=sha256:18fd474d4a82a5f83dac888df697af65afa82dec7323d09c3e37d1f14288da54 \
--hash=sha256:3e386e96793c8702ae83d17b853fb93d3e09ef82ec62722e61da5cd22376dcd8
# via
# -r /tmp/requirementsax9awtnk.in
# marisa-trie
typing-extensions==4.13.0 \
--hash=sha256:0a4ac55a5820789d87e297727d229866c9650f6521b64206413c4fbada24d95b \
--hash=sha256:c8dd92cc0d6425a97c18fbb9d1954e5ff92c1ca881a309c45f06ebc0b79058e5
# via
# mypy
# pydantic
# pydantic-core
# typing-inspection
typing-inspection==0.4.0 \
--hash=sha256:50e72559fcd2a6367a19f7a7e610e6afcb9fac940c650290eed893d61386832f \
--hash=sha256:9765c87de36671694a67904bf2c96e395be9c6439bb6c87b5142569dcdd65122
# via pydantic
uv==0.6.11 \
--hash=sha256:0222e801be2e923787118f7cf2d0530dec016666ce07729215055aac24898899 \
--hash=sha256:11e979e95a0cfed1b6de96d16f772aa027359642953fa329c7fcf88e7468170a \
--hash=sha256:15356a53c4bca7f3d1cbf15321a864ed97b58da16c666261f5869115f87fcb04 \
--hash=sha256:1a861070b06e749c4e53e5e370df6f337ac567e017fe6c10d5b9a48cc6317033 \
--hash=sha256:43af64b66ea1d26ce6c7836caaf61e5d44b3d2025cdee3fc1b8f33a5067cb6fa \
--hash=sha256:473dbe1d3fb53b08090f2c0f05ca46c85b1eadd134496338e167b2cc94e0aae2 \
--hash=sha256:5d4125fddfd42c7691309cc16f2c7f3be6dbcdb61804596756188223f459a25e \
--hash=sha256:5dcf349398a45778fda7677723270158e6c6596e9a49b85f8d7146c50823dd2d \
--hash=sha256:6b4248984a30843fbff8eedc846bb64ba3cebb417378fd04bc29e7273eeca022 \
--hash=sha256:6f3c2adb80f0b93ad312daff7ebb1bf4b26456d7d35a1687827ed03f11d238d7 \
--hash=sha256:7dc1df62322c03c2aee465e4f3f28ef7b3c6a948c2c79fde85651d8bccfc303c \
--hash=sha256:90ed73a0f4b4930fe310b9dc0c6348da37ec5c6026ddcd750868b516ad6abace \
--hash=sha256:94cf32e334329ec2db1704706c6ffb7b58edafb95a63499440327f1df2e0d30e \
--hash=sha256:aed91b596d85277342c9252288d8e3e2b39111b28ac947caa1c540c903797496 \
--hash=sha256:b8c43a85ca2aa429b13a5abde9abae2c58e0189ebc76d7d214b94e667f7d44e2 \
--hash=sha256:b9aebfd86bb9f8e4ed9a7d89d65b8b84ccd6d98061e1ebbaa96f8e6ee4632f10 \
--hash=sha256:baeb89b285f6a16bce390d81005b877c89600b0bfe1fc24895984efd117fa597 \
--hash=sha256:c0a657830850e4ade1d2185ac66dad39015d0bd17c555c274eb00a73e0644fc4
# via -r /tmp/requirementsax9awtnk.in

BIN  releases/tar/dotfiles-0.1.tar.xz (Stored with Git LFS) Normal file. Binary file not shown.

BIN  releases/whl/online_fxreader_pr34-0.1.4.13-py3-none-any.whl (Stored with Git LFS) Normal file. Binary file not shown.

Some files were not shown because too many files have changed in this diff.