Compare commits


10 commits

Author SHA1 Message Date
ae1e7e5aa6
dtmt: Add word extraction algorithm for paths 2024-07-17 09:29:41 +02:00
6ada4c1c43
sdk: Add additional brute force prefixes 2024-07-17 09:29:39 +02:00
6449354714
sdk: Reimplement logging current word 2024-07-17 09:29:37 +02:00
b366185a63
sdk: Implement worker pool for word generation
Massive speed improvement. The index generation is really fast,
and it appears that even worker numbers way higher than the core/thread
count still increase the throughput slightly.

The only missing part is the info output. That's broken, currently.
2024-07-17 09:29:21 +02:00
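Not part of the commit, but a minimal sketch of the worker-pool idea it describes: each thread owns a disjoint slice of the outer word loop and pushes finished candidates into a channel. The word list, delimiters, and pool size here are invented, and the repo's actual pool (the dependency changes below add `crossbeam` with `crossbeam-deque`) will differ.

```rust
use std::sync::mpsc;
use std::thread;

const NUM_WORKERS: usize = 16;

fn main() {
    let words = ["unit", "weapon", "level"];
    let delimiters = ["_", "/"];
    let (tx, rx) = mpsc::channel::<String>();

    let handles: Vec<_> = (0..NUM_WORKERS)
        .map(|worker| {
            let tx = tx.clone();
            thread::spawn(move || {
                // Each worker takes a disjoint slice of the outer loop,
                // so the full space is covered exactly once.
                for (i, w1) in words.iter().enumerate() {
                    if i % NUM_WORKERS != worker {
                        continue;
                    }
                    for d in delimiters {
                        for w2 in words {
                            tx.send(format!("{w1}{d}{w2}")).unwrap();
                        }
                    }
                }
            })
        })
        .collect();
    drop(tx); // close the channel once all workers are done

    // The consumer drains candidates; hashing/lookup would happen here.
    println!("generated {} candidates", rx.iter().count());

    for h in handles {
        h.join().unwrap();
    }
}
```

Worker counts above the core count can still help here because senders block on the channel rather than on CPU, which may be what the commit observed.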
951a7f82c0
sdk: Improve word generation 2024-07-17 09:28:01 +02:00
4480144d92
sdk: Implement guessing a list of hashes
While the approach to generate and store a list of strings does allow
for this list to be re-used in the future, the I/O involved turned out
to be quite costly.

While the generation can run at up to 500 MiB/s, even compressing that
on the fly doesn't reach fast enough write speeds on an HDD.
And compression is also necessary to store this amount of data
(generation reached two TB of raw data with a word length of just three,
which is still 600 GB compressed).
But compression also makes working with that data a lot harder.

So this instead combines both the generation and search into a single
step. The intermediate result of the generation is therefore lost,
but the overall pipeline is much faster.
2024-07-17 09:27:59 +02:00
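A sketch of the fused generate-and-check pipeline the message describes, under assumptions: `DefaultHasher` stands in for the project's actual 64-bit hash (Murmur64), and the target hashes and word list are made up.

```rust
use std::collections::hash_map::DefaultHasher;
use std::collections::HashSet;
use std::hash::{Hash, Hasher};

// Stand-in for the game's 64-bit hash; the real project uses Murmur64.
fn hash64(s: &str) -> u64 {
    let mut h = DefaultHasher::new();
    s.hash(&mut h);
    h.finish()
}

fn main() {
    // Hashes we are trying to recover names for (placeholders).
    let targets: HashSet<u64> = ["unit_weapon", "level/world"]
        .iter()
        .map(|s| hash64(s))
        .collect();

    let words = ["unit", "weapon", "level", "world"];
    let delimiters = ["_", "/"];

    // Nothing is written to disk: each candidate is hashed, checked
    // against the target set, and dropped immediately.
    for w1 in words {
        for d in delimiters {
            for w2 in words {
                let candidate = format!("{w1}{d}{w2}");
                if targets.contains(&hash64(&candidate)) {
                    println!("match: {candidate}");
                }
            }
        }
    }
}
```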
0d1193a126
sdk: Improve word generation throughput
It seems that the simple `println!()` is really bad when the goal
is to write a lot of data to stdout.
Presumably because it's unbuffered, but also because it required the
preceding code to do a lot of allocations.

This was replaced with a buffered writer on stdout, as well as an extra
`Vec<u8>` that I can write everything to directly from the word and
delimiter iterators, without allocating a single new structure.
2024-07-17 09:27:57 +02:00
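The gist of that change as a standalone sketch (names and word list invented): lock stdout once, wrap it in a `BufWriter`, and assemble each candidate into one reusable `Vec<u8>` instead of allocating per line.

```rust
use std::io::{self, BufWriter, Write};

fn main() -> io::Result<()> {
    // Lock stdout once and buffer it, instead of `println!`,
    // which locks and writes line by line.
    let mut out = BufWriter::new(io::stdout().lock());

    // One reusable buffer instead of a fresh String per candidate.
    let mut line: Vec<u8> = Vec::with_capacity(64);

    for w1 in ["unit", "weapon", "level"] {
        for w2 in ["unit", "weapon", "level"] {
            line.clear();
            line.extend_from_slice(w1.as_bytes());
            line.push(b'_');
            line.extend_from_slice(w2.as_bytes());
            line.push(b'\n');
            out.write_all(&line)?;
        }
    }

    out.flush()
}
```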
6485dae27b
experiment: Add command to create word permutations
This creates candidate values to brute force dictionary entries with,
by building combinations from a word list and delimiters.
2024-07-17 09:27:46 +02:00
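One plausible shape for such a permutation generator, assuming candidates are words joined by delimiters up to a fixed depth; the word list, delimiters, and depth are placeholders.

```rust
// Depth-first enumeration: every word, then every word/delimiter
// extension up to `depth` further words.
fn extend(words: &[&str], delims: &[&str], depth: usize, prefix: &str, out: &mut Vec<String>) {
    out.push(prefix.to_string());
    if depth == 0 {
        return;
    }
    for d in delims {
        for w in words {
            extend(words, delims, depth - 1, &format!("{prefix}{d}{w}"), out);
        }
    }
}

fn main() {
    let words = ["unit", "weapon"];
    let delims = ["_", "/"];
    let mut out = Vec::new();
    for w in words {
        extend(&words, &delims, 2, w, &mut out);
    }
    // 2 starting words x (1 + 4 + 16) extensions = 42 candidates here.
    println!("{} candidates", out.len());
}
```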
94347d57f9
dtmt: Add command to extract words from file
As part of trying to brute force values for the dictionary,
this allows extracting candidate words from a file.
2024-07-17 09:20:54 +02:00
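A guess at what such an extractor might look like (the actual splitting and filtering rules are assumptions): treat runs of alphanumeric characters as candidate words and deduplicate them.

```rust
use std::collections::BTreeSet;

// Split on anything that isn't alphanumeric and keep the rest as
// candidate words; length cutoff and lowercasing are assumptions.
fn extract_words(text: &str) -> BTreeSet<String> {
    text.split(|c: char| !c.is_ascii_alphanumeric())
        .filter(|w| w.len() >= 3)
        .map(|w| w.to_ascii_lowercase())
        .collect()
}

fn main() {
    let text = "core/units/beings_player -> player_unit_3p";
    for word in extract_words(text) {
        println!("{word}");
    }
}
```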
2daff544a5
Add subcommand for experimental operations
These may be temporary ones that help during analyzing and developing
file formats, or long-term experiments.
2024-07-17 09:18:56 +02:00
74 changed files with 2722 additions and 3520 deletions

View file

@@ -1,8 +1,7 @@
 # https://jake-shadle.github.io/xwin/
-FROM debian:bullseye-slim AS xwin
+FROM debian:bullseye-slim as xwin

-# renovate: datasource=github-releases depName=xwin packageName=Jake-Shadle/xwin
-ARG XWIN_VERSION=0.6.6
+ARG XWIN_VERSION=0.5.2
 ARG XWIN_PREFIX="xwin-$XWIN_VERSION-x86_64-unknown-linux-musl"

 ADD https://github.com/Jake-Shadle/xwin/releases/download/$XWIN_VERSION/$XWIN_PREFIX.tar.gz /root/$XWIN_PREFIX.tar.gz
@@ -32,7 +31,7 @@ RUN set -eux; \
     # And to keep that to a minimum, we still delete the stuff we don't need.
     rm -rf /root/.xwin-cache;

-FROM rust:slim-bullseye AS linux
+FROM rust:slim-bullseye as linux

 RUN set -eux; \
     apt-get update; \
@@ -59,10 +58,9 @@ WORKDIR /src/dtmt
 COPY lib/oodle/*.so lib/oodle/*.a /src/

-FROM linux AS msvc
+FROM linux as msvc

-# renovate: datasource=github-releases depName=llvm packageName=llvm/llvm-project
-ARG LLVM_VERSION=20
+ARG LLVM_VERSION=18

 ENV KEYRINGS /usr/local/share/keyrings

 ADD https://apt.llvm.org/llvm-snapshot.gpg.key /root/llvm-snapshot.gpg.key

View file

@@ -6,30 +6,24 @@ resource_types:
 - name: gitea-package
   type: registry-image
   source:
-    repository: registry.sclu1034.dev/gitea-package
-    username: ((registry_user))
-    password: ((registry_password))
+    repository: registry.local:5000/gitea-package

 - name: gitea-status
   type: registry-image
   source:
-    repository: registry.sclu1034.dev/gitea-status
-    username: ((registry_user))
-    password: ((registry_password))
+    repository: registry.local:5000/gitea-status

 - name: gitea-pr
   type: registry-image
   source:
-    repository: registry.sclu1034.dev/gitea-pr
-    username: ((registry_user))
-    password: ((registry_password))
+    repository: registry.local:5000/gitea-pr

 resources:
 - name: repo
   type: git
   source:
-    uri: https://git.sclu1034.dev/bitsquid_dt/dtmt
+    uri: http://forgejo:3000/bitsquid_dt/dtmt
     branch: master

 - name: repo-pr
@@ -44,7 +38,7 @@ resources:
   type: gitea-package
   source:
     access_token: ((gitea_api_key))
-    url: https://git.sclu1034.dev
+    url: http://forgejo:3000
     owner: bitsquid_dt
     type: generic
     name: dtmt
@@ -54,7 +48,7 @@ resources:
   type: gitea-status
   source:
     access_token: ((gitea_api_key))
-    url: https://git.sclu1034.dev
+    url: http://forgejo:3000
     owner: bitsquid_dt
     repo: dtmt
     context: build/msvc
@@ -64,7 +58,7 @@ resources:
   type: gitea-status
   source:
     access_token: ((gitea_api_key))
-    url: https://git.sclu1034.dev
+    url: http://forgejo:3000
     owner: bitsquid_dt
     repo: dtmt
     context: build/linux
@@ -88,12 +82,9 @@ jobs:
           values: ((.:prs))
         set_pipeline: dtmt-pr
         file: repo/.ci/pipelines/pr.yml
-        public: true
         vars:
           pr: ((.:pr))
           gitea_api_key: ((gitea_api_key))
-          registry_user: ((registry_user))
-          registry_password: ((registry_password))
         instance_vars:
           number: ((.:pr.number))
@@ -134,8 +125,8 @@ jobs:
         vars:
           pr: ""
           target: msvc
-          registry_user: ((registry_user))
-          registry_password: ((registry_password))
+          gitea_url: http://forgejo:3000
+          gitea_api_key: ((gitea_api_key))

     - load_var: version_number
       reveal: true
@@ -151,21 +142,10 @@
         fail_fast: true
         override: true
         globs:
-          - artifact/dtmt
-          - artifact/dtmm
           - artifact/*.exe
-          - artifact/*.exe.sha256
+          - artifact/*.sha256
-
-    - put: package
-      resource: gitea-package
-      no_get: true
-      inputs:
-        - artifact
-      params:
-        version: master
-        fail_fast: true
-        override: true
-        globs:
-          - artifact/*.exe
-          - artifact/*.exe.sha256

 - name: build-linux
   on_success:
@@ -203,10 +183,8 @@ jobs:
         vars:
           pr: ""
           target: linux
-          gitea_url: https://git.sclu1034.dev
+          gitea_url: http://forgejo:3000
           gitea_api_key: ((gitea_api_key))
-          registry_user: ((registry_user))
-          registry_password: ((registry_password))

     - load_var: version_number
       reveal: true
@@ -224,20 +202,5 @@
         globs:
           - artifact/dtmt
           - artifact/dtmm
-          - artifact/dtmm.sha256
-          - artifact/dtmt.sha256
+          - artifact/*.exe
+          - artifact/*.sha256
-
-    - put: package
-      resource: gitea-package
-      no_get: true
-      inputs:
-        - artifact
-      params:
-        version: master
-        fail_fast: true
-        override: true
-        globs:
-          - artifact/dtmt
-          - artifact/dtmm
-          - artifact/dtmm.sha256
-          - artifact/dtmt.sha256

View file

@@ -18,8 +18,6 @@ jobs:
       file: repo/.ci/tasks/build.yml
       vars:
         target: msvc
-        registry_user: ((registry_user))
-        registry_password: ((registry_password))

 - name: build-linux
   plan:
     - get: repo
@@ -28,5 +26,3 @@ jobs:
       file: repo/.ci/tasks/build.yml
       vars:
         target: linux
-        registry_user: ((registry_user))
-        registry_password: ((registry_password))

View file

@@ -6,30 +6,26 @@ resource_types:
 - name: gitea-package
   type: registry-image
   source:
-    repository: registry.sclu1034.dev/gitea-package
-    username: ((registry_user))
-    password: ((registry_password))
+    repository: registry.local:5000/gitea-package

 - name: gitea-status
   type: registry-image
   source:
-    repository: registry.sclu1034.dev/gitea-status
-    username: ((registry_user))
-    password: ((registry_password))
+    repository: registry.local:5000/gitea-status

 resources:
 - name: repo
   type: git
   source:
-    uri: https://git.sclu1034.dev/bitsquid_dt/dtmt
+    uri: http://forgejo:3000/bitsquid_dt/dtmt
     branch: ((pr.head.ref))

 - name: gitea-package
   type: gitea-package
   source:
     access_token: ((gitea_api_key))
-    url: https://git.sclu1034.dev
+    url: http://forgejo:3000
     owner: bitsquid_dt
     type: generic
     name: dtmt
@@ -38,7 +34,7 @@ resources:
   type: gitea-status
   source:
     access_token: ((gitea_api_key))
-    url: https://git.sclu1034.dev
+    url: http://forgejo:3000
     owner: bitsquid_dt
     repo: dtmt
     context: lint/clippy
@@ -48,7 +44,7 @@ resources:
   type: gitea-status
   source:
     access_token: ((gitea_api_key))
-    url: https://git.sclu1034.dev
+    url: http://forgejo:3000
     owner: bitsquid_dt
     repo: dtmt
     context: build/msvc
@@ -58,7 +54,7 @@ resources:
   type: gitea-status
   source:
     access_token: ((gitea_api_key))
-    url: https://git.sclu1034.dev
+    url: http://forgejo:3000
     owner: bitsquid_dt
     repo: dtmt
     context: build/linux
@@ -101,8 +97,6 @@ jobs:
       file: repo/.ci/tasks/clippy.yml
       vars:
         gitea_api_key: ((gitea_api_key))
-        registry_user: ((registry_user))
-        registry_password: ((registry_password))

 - name: build-msvc
@@ -141,10 +135,8 @@ jobs:
       vars:
         target: msvc
         pr: ((pr))
-        gitea_url: https://git.sclu1034.dev
+        gitea_url: http://forgejo:3000
         gitea_api_key: ((gitea_api_key))
-        registry_user: ((registry_user))
-        registry_password: ((registry_password))

     - load_var: version_number
       reveal: true
@@ -201,10 +193,8 @@ jobs:
       vars:
         target: linux
        pr: ((pr))
-        gitea_url: https://git.sclu1034.dev
+        gitea_url: http://forgejo:3000
         gitea_api_key: ((gitea_api_key))
-        registry_user: ((registry_user))
-        registry_password: ((registry_password))

     - load_var: version_number
       reveal: true

View file

@@ -20,15 +20,12 @@ install_artifact() {
 cd "repo"

 PR=${PR:-}
-ref=$(cat .git/ref || echo "HEAD")

 if [ -n "$PR" ]; then
     title "PR: $(echo "$PR" | jq '.number') - $(echo "$PR" | jq '.title')"
-    ref="pr-$(echo "$PR" | jq '.number')-$(git rev-parse --short "$ref" 2>/dev/null || echo 'manual')"
+    ref="pr-$(echo "$PR" | jq '.number')-$(git rev-parse --short "$(cat .git/ref || echo "HEAD")" 2>/dev/null || echo 'manual')"
-elif [ -f ".git/branch" ]; then
-    ref=$(cat .git/branch)-$(git rev-parse --short "$ref")
 else
-    ref=$(git rev-parse --short "$ref")
+    ref=$(git describe --tags)
 fi

 title "Version: '$ref'"

View file

@@ -6,9 +6,7 @@ image_resource:
   name: ctmt-bi-base-((target))
   type: registry-image
   source:
-    repository: registry.sclu1034.dev/dtmt-ci-base-((target))
-    username: ((registry_user))
-    password: ((registry_password))
+    repository: registry.local:5000/dtmt-ci-base-((target))
     tag: latest

 inputs:
@@ -24,6 +22,7 @@ caches:
 params:
   CI: "true"
   TARGET: ((target))
+  GITEA_API_KEY: ((gitea_api_key))
   PR: ((pr))
   OUTPUT: artifact

View file

@@ -10,6 +10,6 @@ title "Install clippy"
 rustup component add clippy

 title "Run clippy"
-cargo clippy --color always --no-deps -- -D warnings
+cargo clippy --color always --no-deps

 title "Done"

View file

@@ -6,9 +6,7 @@ image_resource:
   name: dtmt-ci-base-linux
   type: registry-image
   source:
-    repository: registry.sclu1034.dev/dtmt-ci-base-linux
-    username: ((registry_user))
-    password: ((registry_password))
+    repository: registry.local:5000/dtmt-ci-base-linux
     tag: latest

 inputs:

13
.gitmodules vendored
View file

@@ -1,11 +1,14 @@
+[submodule "lib/serde_sjson"]
+    path = lib/serde_sjson
+    url = https://git.sclu1034.dev/lucas/serde_sjson.git
+[submodule "lib/luajit2-sys"]
+    path = lib/luajit2-sys
+    url = https://github.com/sclu1034/luajit2-sys.git
 [submodule "lib/color-eyre"]
     path = lib/color-eyre
     url = https://github.com/sclu1034/color-eyre.git
     branch = "fork"
 [submodule "lib/ansi-parser"]
     path = lib/ansi-parser
     url = https://gitlab.com/lschwiderski/ansi-parser.git
     branch = "issue/outdated-nom"
-[submodule "lib/luajit2-sys/luajit"]
-    path = lib/luajit2-sys/luajit
-    url = https://github.com/LuaJIT/LuaJIT.git

View file

@@ -1,44 +0,0 @@
-{
-  "$schema": "https://docs.renovatebot.com/renovate-schema.json",
-  "extends": [
-    "config:recommended",
-    ":combinePatchMinorReleases",
-    ":enableVulnerabilityAlerts"
-  ],
-  "prConcurrentLimit": 10,
-  "branchPrefix": "renovate/",
-  "baseBranches": [
-    "$default",
-    "/^release\\/.*/"
-  ],
-  "ignorePaths": [
-    "lib/color_eyre/**",
-    "lib/ansi-parser/**",
-    "lib/luajit2-sys/**",
-    "**/target/**"
-  ],
-  "customManagers": [
-    {
-      "customType": "regex",
-      "description": "Update _VERSION variables in Dockerfiles",
-      "fileMatch": [
-        "(^|/|\\.)Dockerfile$",
-        "(^|/)Dockerfile\\.[^/]*$"
-      ],
-      "matchStrings": [
-        "# renovate: datasource=(?<datasource>[a-z-]+?)(?: depName=(?<depName>.+?))? packageName=(?<packageName>.+?)(?: versioning=(?<versioning>[a-z-]+?))?\\s(?:ENV|ARG) .+?_VERSION=(?<currentValue>.+?)\\s"
-      ]
-    }
-  ],
-  "packageRules": [
-    {
-      "matchDatasources": [
-        "github-releases"
-      ],
-      "matchPackageNames": [
-        "llvm/llvm-project"
-      ],
-      "extractVersion": "^llvmorg-(?<version>\\d+)\\.\\d+\\.\\d+$"
-    }
-  ]
-}

View file

@@ -20,8 +20,6 @@
 - dtmm: fetch file version for Nexus mods
 - dtmm: handle `nxm://` URIs via IPC and import the corresponding mod
 - dtmm: Add button to open mod on nexusmods.com
-- dtmt: Implement commands to list bundles and contents
-- dtmt: Implement command to search for files

 === Fixed

1807
Cargo.lock generated

File diff suppressed because it is too large

View file

@@ -6,61 +6,18 @@ members = [
     "lib/dtmt-shared",
     "lib/oodle",
     "lib/sdk",
+    "lib/serde_sjson",
     "lib/luajit2-sys",
     "lib/color-eyre",
 ]
 exclude = ["lib/color-eyre"]

 [workspace.dependencies]
-ansi-parser = "0.9.1"
-ansi_term = "0.12.1"
-async-recursion = "1.0.5"
-bincode = "2.0.0"
-bindgen = "0.72.0"
-bitflags = "2.5.0"
-byteorder = "1.4.3"
-cc = { version = "1.2.27", features = ["parallel"] }
-clap = { version = "4.0.15", features = ["color", "derive", "std", "cargo", "string", "unicode"] }
-cli-table = { version = "0.5.0", default-features = false, features = ["derive"] }
+zip = { version = "2.1.3", default-features = false, features = ["deflate", "bzip2", "zstd", "time"] }

+[patch.crates-io]
 color-eyre = { path = "lib/color-eyre" }
-colors-transform = "0.2.11"
-confy = "1.0.0"
-csv-async = { version = "1.2.4", features = ["tokio", "serde"] }
-druid = { version = "0.8", features = ["im", "serde", "image", "png", "jpeg", "bmp", "webp", "svg"] }
-druid-widget-nursery = "0.1"
-dtmt-shared = { path = "lib/dtmt-shared" }
-fastrand = "2.1.0"
-fs_extra = "1.1.0"
-futures = "0.3.25"
-futures-util = "0.3.24"
-glob = "0.3.0"
-interprocess = "2.1.0"
-lazy_static = "1.4.0"
-libc = "0.2.174"
-luajit2-sys = { path = "lib/luajit2-sys" }
-minijinja = { version = "2.0.1", default-features = false, features = ["serde"] }
-nanorand = "0.8.0"
-nexusmods = { path = "lib/nexusmods" }
-notify = "8.0.0"
-oodle = { path = "lib/oodle" }
-open = "5.0.1"
-path-clean = "1.0.1"
-path-slash = "0.2.1"
-pin-project-lite = "0.2.9"
-promptly = "0.3.1"
-sdk = { path = "lib/sdk" }
-serde = { version = "1.0.152", features = ["derive", "rc"] }
-serde_sjson = "1.2.1"
-steamlocate = "2.0.0-beta.2"
-strip-ansi-escapes = "0.2.0"
-time = { version = "0.3.20", features = ["serde", "serde-well-known", "local-offset", "formatting", "macros"] }
-tokio = { version = "1.23.0", features = ["rt-multi-thread", "fs", "process", "macros", "tracing", "io-util", "io-std"] }
-tokio-stream = { version = "0.1.12", features = ["fs", "io-util"] }
-tracing = { version = "0.1.37", features = ["async-await"] }
-tracing-error = "0.2.0"
-tracing-subscriber = { version = "0.3.16", features = ["env-filter"] }
-usvg = "0.25.0"
-zip = { version = "4.0.0", default-features = false, features = ["deflate", "bzip2", "zstd", "time"] }
+ansi-parser = { git = "https://gitlab.com/lschwiderski/ansi-parser.git", branch = "issue/outdated-heapless", version = "0.9.1" }

 [profile.dev.package.backtrace]
 opt-level = 3

View file

@@ -40,8 +40,6 @@ set-base-pipeline:
     --pipeline dtmt \
     --config .ci/pipelines/base.yml \
     -v gitea_api_key=${GITEA_API_KEY} \
-    -v registry_user=${REGISTRY_USER} \
-    -v registry_password=${REGISTRY_PASSWORD} \
     -v owner=bitsquid_dt \
    -v repo=dtmt

View file

@ -12,37 +12,37 @@ license-file = "LICENSE"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html # See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies] [dependencies]
ansi-parser = { workspace = true } ansi-parser = "0.9.0"
async-recursion = { workspace = true } async-recursion = "1.0.5"
bincode = { workspace = true } bincode = "1.3.3"
bitflags = { workspace = true } bitflags = "2.5.0"
clap = { workspace = true } clap = { version = "4.0.15", features = ["color", "derive", "std", "cargo", "string", "unicode"] }
color-eyre = { workspace = true } color-eyre = "0.6.2"
colors-transform = { workspace = true } colors-transform = "0.2.11"
confy = { workspace = true } confy = "0.6.1"
druid = { workspace = true } druid = { version = "0.8", features = ["im", "serde", "image", "png", "jpeg", "bmp", "webp", "svg"] }
druid-widget-nursery = { workspace = true } druid-widget-nursery = "0.1"
dtmt-shared = { workspace = true } dtmt-shared = { path = "../../lib/dtmt-shared", version = "*" }
futures = { workspace = true } futures = "0.3.25"
interprocess = { workspace = true } interprocess = "2.1.0"
lazy_static = { workspace = true } lazy_static = "1.4.0"
luajit2-sys = { workspace = true } luajit2-sys = { path = "../../lib/luajit2-sys", version = "*" }
minijinja = { workspace = true } minijinja = { version = "2.0.1", default-features = false }
nexusmods = { workspace = true } nexusmods = { path = "../../lib/nexusmods", version = "*" }
oodle = { workspace = true } oodle = { path = "../../lib/oodle", version = "*" }
open = { workspace = true } open = "5.0.1"
path-slash = { workspace = true } path-slash = "0.2.1"
sdk = { workspace = true } sdk = { path = "../../lib/sdk", version = "*" }
serde = { workspace = true } serde = { version = "1.0.152", features = ["derive", "rc"] }
serde_sjson = { workspace = true } serde_sjson = { path = "../../lib/serde_sjson", version = "*" }
strip-ansi-escapes = { workspace = true } strip-ansi-escapes = "0.2.0"
time = { workspace = true } time = { version = "0.3.20", features = ["serde", "serde-well-known", "local-offset"] }
tokio = { workspace = true } tokio = { version = "1.23.0", features = ["rt", "fs", "tracing", "sync"] }
tokio-stream = { workspace = true } tokio-stream = { version = "0.1.12", features = ["fs"] }
tracing = { workspace = true } tracing = "0.1.37"
tracing-error = { workspace = true } tracing-error = "0.2.0"
tracing-subscriber = { workspace = true } tracing-subscriber = { version = "0.3.16", features = ["env-filter"] }
usvg = { workspace = true } usvg = "0.25.0"
zip = { workspace = true } zip = { workspace = true }
[build-dependencies] [build-dependencies]

View file

@@ -116,14 +116,14 @@ async fn patch_game_settings(state: Arc<ActionState>) -> Result<()> {
         eyre::bail!("couldn't find 'boot_script' field");
     };

-    f.write_all(&settings.as_bytes()[0..i]).await?;
+    f.write_all(settings[0..i].as_bytes()).await?;
     f.write_all(b"boot_script = \"scripts/mod_main\"").await?;

     let Some(j) = settings[i..].find('\n') else {
         eyre::bail!("couldn't find end of 'boot_script' field");
     };

-    f.write_all(&settings.as_bytes()[(i + j)..]).await?;
+    f.write_all(settings[(i + j)..].as_bytes()).await?;

     Ok(())
 }
@@ -324,11 +324,11 @@ async fn build_bundles(state: Arc<ActionState>) -> Result<Vec<Bundle>> {
     let mut bundles = Vec::new();

-    let mut add_lua_asset = |name: &str, data: &str| {
+    let mut add_lua_asset = |name, data: &str| {
         let span = tracing::info_span!("Compiling Lua", name, data_len = data.len());
         let _enter = span.enter();

-        let file = lua::compile(name.to_string(), data).wrap_err("Failed to compile Lua")?;
+        let file = lua::compile(name, data).wrap_err("Failed to compile Lua")?;

         mod_bundle.add_file(file);
@@ -453,7 +453,10 @@ async fn build_bundles(state: Arc<ActionState>) -> Result<Vec<Bundle>> {
 }

 #[tracing::instrument(skip_all)]
-async fn patch_boot_bundle(state: Arc<ActionState>, deployment_info: &str) -> Result<Vec<Bundle>> {
+async fn patch_boot_bundle(
+    state: Arc<ActionState>,
+    deployment_info: &String,
+) -> Result<Vec<Bundle>> {
     let bundle_dir = Arc::new(state.game_dir.join("bundle"));
     let bundle_path = bundle_dir.join(format!("{:x}", Murmur64::hash(BOOT_BUNDLE_NAME.as_bytes())));
@@ -469,7 +472,7 @@ async fn patch_boot_bundle(state: Arc<ActionState>, deployment_info: &str) -> Re
     }
     .instrument(tracing::trace_span!("read boot bundle"))
     .await
-    .wrap_err_with(|| format!("Failed to read bundle '{BOOT_BUNDLE_NAME}'"))?;
+    .wrap_err_with(|| format!("Failed to read bundle '{}'", BOOT_BUNDLE_NAME))?;

     {
         tracing::trace!("Adding mod package file to boot bundle");
@@ -514,8 +517,8 @@ async fn patch_boot_bundle(state: Arc<ActionState>, deployment_info: &str) -> Re
         .wrap_err("Failed to render template `mod_main.lua`")?;

     tracing::trace!("Main script rendered:\n===========\n{}\n=============", lua);

-    let file = lua::compile(MOD_BOOT_SCRIPT.to_string(), lua)
-        .wrap_err("Failed to compile mod main Lua file")?;
+    let file =
+        lua::compile(MOD_BOOT_SCRIPT, lua).wrap_err("Failed to compile mod main Lua file")?;

     boot_bundle.add_file(file);
 }
@@ -587,7 +590,11 @@ fn build_deployment_data(
             .map(|bundle| format!("{:x}", bundle.name().to_murmur64()))
             .collect(),
         // TODO:
-        mod_folders: mod_folders.as_ref().to_vec(),
+        mod_folders: mod_folders
+            .as_ref()
+            .iter()
+            .map(|folder| folder.clone())
+            .collect(),
     };

     serde_sjson::to_string(&info).wrap_err("Failed to serizalize deployment data")
 }

View file

@@ -91,14 +91,14 @@ async fn patch_game_settings(state: Arc<ActionState>) -> Result<()> {
         eyre::bail!("couldn't find 'boot_script' field");
     };

-    f.write_all(&settings.as_bytes()[0..i]).await?;
+    f.write_all(settings[0..i].as_bytes()).await?;
     f.write_all(b"boot_script = \"scripts/mod_main\"").await?;

     let Some(j) = settings[i..].find('\n') else {
         eyre::bail!("couldn't find end of 'boot_script' field");
     };

-    f.write_all(&settings.as_bytes()[(i + j)..]).await?;
+    f.write_all(settings[(i + j)..].as_bytes()).await?;

     Ok(())
 }
@@ -208,7 +208,7 @@ pub(crate) async fn reset_mod_deployment(state: ActionState) -> Result<()> {
     for p in paths {
         let path = bundle_dir.join(p);
-        let backup = bundle_dir.join(format!("{p}.bak"));
+        let backup = bundle_dir.join(&format!("{}.bak", p));

         let res = async {
             tracing::debug!(

View file

@@ -297,7 +297,6 @@ fn extract_mod_config<R: Read + Seek>(archive: &mut ZipArchive<R>) -> Result<(Mo
         packages: Vec::new(),
         resources,
         depends: Vec::new(),
-        name_overrides: Default::default(),
     };

     Ok((cfg, root))
@@ -363,7 +362,7 @@ fn extract_legacy_mod<R: Read + Seek>(
     for i in 0..file_count {
         let mut f = archive
             .by_index(i)
-            .wrap_err_with(|| format!("Failed to get file at index {i}"))?;
+            .wrap_err_with(|| format!("Failed to get file at index {}", i))?;

         let Some(name) = f.enclosed_name().map(|p| p.to_path_buf()) else {
             let err = eyre::eyre!("File name in archive is not a safe path value.").suggestion(
@@ -397,7 +396,7 @@ fn extract_legacy_mod<R: Read + Seek>(
         tracing::trace!("Writing file '{}'", name.display());
         let mut out = std::fs::OpenOptions::new()
             .write(true)
-            .truncate(true)
+            .create(true)
             .open(&name)
             .wrap_err_with(|| format!("Failed to open file '{}'", name.display()))?;
@@ -426,7 +425,7 @@ pub(crate) async fn import_from_file(state: ActionState, info: FileInfo) -> Resu
     let mod_info = api
         .mods_id(id)
         .await
-        .wrap_err_with(|| format!("Failed to query mod {id} from Nexus"))?;
+        .wrap_err_with(|| format!("Failed to query mod {} from Nexus", id))?;

     let version = match api.file_version(id, timestamp).await {
         Ok(version) => version,
@@ -461,13 +460,13 @@ pub(crate) async fn import_from_file(state: ActionState, info: FileInfo) -> Resu
 pub(crate) async fn import_from_nxm(state: ActionState, uri: String) -> Result<ModInfo> {
     let url = uri
         .parse()
-        .wrap_err_with(|| format!("Invalid Uri '{uri}'"))?;
+        .wrap_err_with(|| format!("Invalid Uri '{}'", uri))?;

     let api = NexusApi::new(state.nexus_api_key.to_string())?;
     let (mod_info, file_info, data) = api
         .handle_nxm(url)
         .await
-        .wrap_err_with(|| format!("Failed to download mod from NXM uri '{uri}'"))?;
+        .wrap_err_with(|| format!("Failed to download mod from NXM uri '{}'", uri))?;

     let nexus = NexusInfo::from(mod_info);
     import_mod(state, Some((nexus, file_info.version)), data).await
@@ -524,7 +523,7 @@ pub(crate) async fn import_mod(
         let data = api
             .picture(url)
             .await
-            .wrap_err_with(|| format!("Failed to download Nexus image from '{url}'"))?;
+            .wrap_err_with(|| format!("Failed to download Nexus image from '{}'", url))?;

         let img = image_data_to_buffer(&data)?;

View file

@@ -47,19 +47,15 @@ fn notify_nxm_download(
             .to_ns_name::<GenericNamespaced>()
             .expect("Invalid socket name"),
     )
-    .wrap_err_with(|| format!("Failed to connect to '{IPC_ADDRESS}'"))
+    .wrap_err_with(|| format!("Failed to connect to '{}'", IPC_ADDRESS))
     .suggestion("Make sure the main window is open.")?;

     tracing::debug!("Connected to main process at '{}'", IPC_ADDRESS);

-    let bincode_config = bincode::config::standard();
-
-    bincode::encode_into_std_write(uri.as_ref(), &mut stream, bincode_config)
-        .wrap_err("Failed to send URI")?;
+    bincode::serialize_into(&mut stream, uri.as_ref()).wrap_err("Failed to send URI")?;

     // We don't really care what the message is, we just need an acknowledgement.
-    let _: String = bincode::decode_from_std_read(&mut stream, bincode_config)
-        .wrap_err("Failed to receive reply")?;
+    let _: String = bincode::deserialize_from(&mut stream).wrap_err("Failed to receive reply")?;

     tracing::info!(
         "Notified DTMM with uri '{}'. Check the main window.",
@@ -159,38 +155,27 @@ fn main() -> Result<()> {
         loop {
             let res = server.accept().wrap_err_with(|| {
-                format!("IPC server failed to listen on '{IPC_ADDRESS}'")
+                format!("IPC server failed to listen on '{}'", IPC_ADDRESS)
             });

             match res {
                 Ok(mut stream) => {
-                    let res = bincode::decode_from_std_read(
-                        &mut stream,
-                        bincode::config::standard(),
-                    )
-                    .wrap_err("Failed to read message")
-                    .and_then(|uri: String| {
-                        tracing::trace!(uri, "Received NXM uri");
+                    let res = bincode::deserialize_from(&mut stream)
+                        .wrap_err("Failed to read message")
+                        .and_then(|uri: String| {
+                            tracing::trace!(uri, "Received NXM uri");

                             event_sink
                                 .submit_command(ACTION_HANDLE_NXM, uri, druid::Target::Auto)
                                 .wrap_err("Failed to start NXM download")
                         });

                     match res {
                         Ok(()) => {
-                            let _ = bincode::encode_into_std_write(
-                                "Ok",
-                                &mut stream,
-                                bincode::config::standard(),
-                            );
+                            let _ = bincode::serialize_into(&mut stream, "Ok");
                         }
                         Err(err) => {
                             tracing::error!("{:?}", err);
-                            let _ = bincode::encode_into_std_write(
-                                "Error",
-                                &mut stream,
-                                bincode::config::standard(),
-                            );
+                            let _ = bincode::serialize_into(&mut stream, "Error");
                         }
                     }
                 }

View file

@@ -108,19 +108,20 @@ impl std::fmt::Debug for AsyncAction {
         match self {
             AsyncAction::DeployMods(_) => write!(f, "AsyncAction::DeployMods(_state)"),
             AsyncAction::ResetDeployment(_) => write!(f, "AsyncAction::ResetDeployment(_state)"),
-            AsyncAction::AddMod(_, info) => write!(f, "AsyncAction::AddMod(_state, {info:?})"),
+            AsyncAction::AddMod(_, info) => write!(f, "AsyncAction::AddMod(_state, {:?})", info),
             AsyncAction::DeleteMod(_, info) => {
-                write!(f, "AsyncAction::DeleteMod(_state, {info:?})")
+                write!(f, "AsyncAction::DeleteMod(_state, {:?})", info)
             }
             AsyncAction::SaveSettings(_) => write!(f, "AsyncAction::SaveSettings(_state)"),
             AsyncAction::CheckUpdates(_) => write!(f, "AsyncAction::CheckUpdates(_state)"),
             AsyncAction::LoadInitial((path, is_default)) => write!(
                 f,
-                "AsyncAction::LoadInitial(({path:?}, {is_default:?}))"
+                "AsyncAction::LoadInitial(({:?}, {:?}))",
+                path, is_default
             ),
             AsyncAction::Log(_) => write!(f, "AsyncAction::Log(_)"),
             AsyncAction::NxmDownload(_, uri) => {
-                write!(f, "AsyncAction::NxmDownload(_state, {uri})")
+                write!(f, "AsyncAction::NxmDownload(_state, {})", uri)
             }
         }
     }
@@ -447,7 +448,7 @@ impl AppDelegate<State> for Delegate {
             if let Err(err) = open::that_detached(Arc::as_ref(url)) {
                 tracing::error!(
                     "{:?}",
-                    Report::new(err).wrap_err(format!("Failed to open url '{url}'"))
+                    Report::new(err).wrap_err(format!("Failed to open url '{}'", url))
                 );
             }

View file

@@ -76,7 +76,7 @@ impl ColorExt for Color {
     fn darken(&self, fac: f32) -> Self {
         let (r, g, b, a) = self.as_rgba();
         let rgb = Rgb::from(r as f32, g as f32, b as f32);
-        let rgb = rgb.lighten(-fac);
+        let rgb = rgb.lighten(-1. * fac);
         Self::rgba(
             rgb.get_red() as f64,
             rgb.get_green() as f64,

View file

@@ -5,7 +5,6 @@ use druid::{
 use crate::state::{State, ACTION_SET_DIRTY, ACTION_START_SAVE_SETTINGS};

-#[allow(dead_code)]
 pub struct DisabledButtonController;

 impl<T: Data> Controller<T, Button<T>> for DisabledButtonController {

View file

@@ -34,9 +34,9 @@ pub fn error<T: Data>(err: Report, _parent: WindowHandle) -> WindowDesc<T> {
             // The second to last one, the context to the root cause
             let context = err.chain().nth(count - 2).unwrap();

-            (format!("{first}!"), format!("{context}: {root}"))
+            (format!("{first}!"), format!("{}: {}", context, root))
         } else {
-            ("An error occurred!".to_string(), format!("{first}: {root}"))
+            ("An error occurred!".to_string(), format!("{}: {}", first, root))
         }
     }
 };

View file

@@ -348,7 +348,7 @@ fn build_mod_details_info() -> impl Widget<State> {
     let nexus_link = Maybe::or_empty(|| {
         let link = Label::raw().lens(NexusInfo::id.map(
             |id| {
-                let url = format!("https://nexusmods.com/warhammer40kdarktide/mods/{id}");
+                let url = format!("https://nexusmods.com/warhammer40kdarktide/mods/{}", id);
                 let mut builder = RichTextBuilder::new();
                 builder
                     .push("Open on Nexusmods")

View file

@ -4,37 +4,38 @@ version = "0.3.0"
edition = "2021" edition = "2021"
[dependencies] [dependencies]
async-recursion = { workspace = true } clap = { version = "4.0.15", features = ["color", "derive", "std", "cargo", "unicode"] }
clap = { workspace = true } cli-table = { version = "0.4.7", default-features = false, features = ["derive"] }
cli-table = { workspace = true } color-eyre = "0.6.2"
color-eyre = { workspace = true } confy = "0.6.1"
confy = { workspace = true } csv-async = { version = "1.2.4", features = ["tokio", "serde"] }
csv-async = { workspace = true } dtmt-shared = { path = "../../lib/dtmt-shared", version = "*" }
dtmt-shared = { workspace = true } futures = "0.3.25"
futures = { workspace = true } futures-util = "0.3.24"
futures-util = { workspace = true } glob = "0.3.0"
glob = { workspace = true } nanorand = "0.7.0"
luajit2-sys = { workspace = true } oodle = { path = "../../lib/oodle", version = "*" }
minijinja = { workspace = true } pin-project-lite = "0.2.9"
nanorand = { workspace = true } promptly = "0.3.1"
notify = { workspace = true } sdk = { path = "../../lib/sdk", version = "*" }
oodle = { workspace = true } serde_sjson = { path = "../../lib/serde_sjson", version = "*" }
path-clean = { workspace = true } serde = { version = "1.0.147", features = ["derive"] }
path-slash = { workspace = true } string_template = "0.2.1"
pin-project-lite = { workspace = true } tokio-stream = { version = "0.1.11", features = ["fs", "io-util"] }
promptly = { workspace = true } tokio = { version = "1.21.2", features = ["rt-multi-thread", "fs", "process", "macros", "tracing", "io-util", "io-std"] }
sdk = { workspace = true } tracing-error = "0.2.0"
serde = { workspace = true } tracing-subscriber = { version = "0.3.16", features = ["env-filter"] }
serde_sjson = { workspace = true } tracing = { version = "0.1.37", features = ["async-await"] }
tokio = { workspace = true }
tokio-stream = { workspace = true }
tracing = { workspace = true }
tracing-error = { workspace = true }
tracing-subscriber = { workspace = true }
zip = { workspace = true } zip = { workspace = true }
path-clean = "1.0.1"
# Cannot be a workspace dependencies when it's optional path-slash = "0.2.1"
async-recursion = "1.0.2"
notify = "6.1.1"
luajit2-sys = { path = "../../lib/luajit2-sys", version = "*" }
shlex = { version = "1.2.0", optional = true } shlex = { version = "1.2.0", optional = true }
atty = "0.2.14"
itertools = "0.11.0"
crossbeam = { version = "0.8.2", features = ["crossbeam-deque"] }
[dev-dependencies] [dev-dependencies]
tempfile = "3.3.0" tempfile = "3.3.0"

View file

@@ -55,7 +55,6 @@ pub(crate) fn command_definition() -> Command {
     )
 }

-/// Try to find a `dtmt.cfg` in the given directory or traverse up the parents.
 #[tracing::instrument]
 async fn find_project_config(dir: Option<PathBuf>) -> Result<ModConfig> {
     let (path, mut file) = if let Some(path) = dir {
@@ -103,44 +102,39 @@
     Ok(cfg)
 }

-/// Iterate over the paths in the given `Package` and
-/// compile each file by its file type.
 #[tracing::instrument(skip_all)]
-async fn compile_package_files(pkg: &Package, cfg: &ModConfig) -> Result<Vec<BundleFile>> {
-    let root = Arc::new(&cfg.dir);
-    let name_overrides = &cfg.name_overrides;
+async fn compile_package_files<P>(pkg: &Package, root: P) -> Result<Vec<BundleFile>>
+where
+    P: AsRef<Path> + std::fmt::Debug,
+{
+    let root = Arc::new(root.as_ref());
     let tasks = pkg
         .iter()
-        .flat_map(|(file_type, names)| {
-            names.iter().map(|name| {
+        .flat_map(|(file_type, paths)| {
+            paths.iter().map(|path| {
                 (
                     *file_type,
-                    name,
+                    path,
                     // Cloning the `Arc` here solves the issue that in the next `.map`, I need to
                     // `move` the closure parameters, but can't `move` `root` before it was cloned.
                     root.clone(),
                 )
             })
         })
-        .map(|(file_type, name, root)| async move {
-            let path = PathBuf::from(name);
-            let sjson = fs::read_to_string(&path)
-                .await
-                .wrap_err_with(|| format!("Failed to read file '{}'", path.display()))?;
-
-            let name = path.with_extension("").to_slash_lossy().to_string();
-            let name = if let Some(new_name) = name_overrides.get(&name) {
-                let new_name = match u64::from_str_radix(new_name, 16) {
-                    Ok(hash) => IdString64::from(hash),
-                    Err(_) => IdString64::from(new_name.clone()),
-                };
-                tracing::info!("Overriding '{}' -> '{}'", name, new_name.display());
-                new_name
-            } else {
-                IdString64::from(name.clone())
-            };
-
-            BundleFile::from_sjson(name, file_type, sjson, root.as_ref()).await
+        .map(|(file_type, path, root)| async move {
+            let sjson = fs::read_to_string(&path).await?;
+
+            let mut path = path.clone();
+            path.set_extension("");
+
+            BundleFile::from_sjson(
+                path.to_slash_lossy().to_string(),
+                file_type,
+                sjson,
+                root.as_ref(),
+            )
+            .await
         });

     let results = futures::stream::iter(tasks)
@@ -151,14 +145,13 @@
     results.into_iter().collect()
 }

-/// Read a `.package` file, collect the referenced files
-/// and compile all of them into a bundle.
 #[tracing::instrument]
-async fn build_package(
-    cfg: &ModConfig,
-    package: impl AsRef<Path> + std::fmt::Debug,
-) -> Result<Bundle> {
-    let root = &cfg.dir;
+async fn build_package<P1, P2>(package: P1, root: P2) -> Result<Bundle>
+where
+    P1: AsRef<Path> + std::fmt::Debug,
+    P2: AsRef<Path> + std::fmt::Debug,
+{
+    let root = root.as_ref();
     let package = package.as_ref();

     let mut path = root.join(package);
@@ -172,7 +165,7 @@
         .await
         .wrap_err_with(|| format!("Invalid package file {}", &pkg_name))?;

-    let files = compile_package_files(&pkg, cfg).await?;
+    let files = compile_package_files(&pkg, root).await?;
     let mut bundle = Bundle::new(pkg_name);
     for file in files {
         bundle.add_file(file);
@@ -181,8 +174,6 @@
     Ok(bundle)
 }

-/// Cleans the path of internal parent (`../`) or self (`./`) components,
-/// and ensures that it is relative.
 fn normalize_file_path<P: AsRef<Path>>(path: P) -> Result<PathBuf> {
     let path = path.as_ref();
@@ -263,14 +254,14 @@ pub(crate) async fn read_project_config(dir: Option<PathBuf>) -> Result<ModConfi
     Ok(cfg)
 }

-#[tracing::instrument]
-pub(crate) async fn build<P>(
+pub(crate) async fn build<P1, P2>(
     cfg: &ModConfig,
-    out_path: impl AsRef<Path> + std::fmt::Debug,
-    game_dir: Arc<Option<P>>,
+    out_path: P1,
+    game_dir: Arc<Option<P2>>,
 ) -> Result<()>
 where
-    P: AsRef<Path> + std::fmt::Debug,
+    P1: AsRef<Path>,
+    P2: AsRef<Path>,
 {
     let out_path = out_path.as_ref();
@@ -295,7 +286,7 @@ where
         );
     }

-    let bundle = build_package(&cfg, path).await.wrap_err_with(|| {
+    let bundle = build_package(path, &cfg.dir).await.wrap_err_with(|| {
         format!(
             "Failed to build package '{}' at '{}'",
             path.display(),

View file

@@ -1,174 +0,0 @@
-use std::{io::Cursor, path::PathBuf};
-
-use clap::{value_parser, Arg, ArgMatches, Command};
-use color_eyre::{eyre::Context as _, Result};
-use sdk::murmur::{HashGroup, IdString64, Murmur64};
-use sdk::{BundleDatabase, FromBinary as _};
-use tokio::fs;
-
-pub(crate) fn command_definition() -> Command {
-    Command::new("db")
-        .about("Various operations regarding `bundle_database.data`.")
-        .subcommand_required(true)
-        .subcommand(
-            Command::new("list-files")
-                .about("List bundle contents")
-                .arg(
-                    Arg::new("database")
-                        .required(true)
-                        .help("Path to the bundle database")
-                        .value_parser(value_parser!(PathBuf)),
-                )
-                .arg(
-                    Arg::new("bundle")
-                        .help("The bundle name. If omitted, all bundles will be listed.")
-                        .required(false),
-                ),
-        )
-        .subcommand(
-            Command::new("list-bundles").about("List bundles").arg(
-                Arg::new("database")
-                    .required(true)
-                    .help("Path to the bundle database")
-                    .value_parser(value_parser!(PathBuf)),
-            ),
-        )
-        .subcommand(
-            Command::new("find-file")
-                .about("Find the bundle a file belongs to")
-                .arg(
-                    Arg::new("database")
-                        .required(true)
-                        .help("Path to the bundle database")
-                        .value_parser(value_parser!(PathBuf)),
-                )
-                .arg(
-                    Arg::new("file-name")
-                        .required(true)
-                        .help("Name of the file. May be a hash in hex representation or a string"),
-                ),
-        )
-}
-
-#[tracing::instrument(skip_all)]
-pub(crate) async fn run(ctx: sdk::Context, matches: &ArgMatches) -> Result<()> {
-    let Some((op, sub_matches)) = matches.subcommand() else {
-        unreachable!("clap is configured to require a subcommand");
-    };
-
-    let database = {
-        let path = sub_matches
-            .get_one::<PathBuf>("database")
-            .expect("argument is required");
-        let binary = fs::read(&path)
-            .await
-            .wrap_err_with(|| format!("Failed to read file '{}'", path.display()))?;
-        let mut r = Cursor::new(binary);
-        BundleDatabase::from_binary(&mut r).wrap_err("Failed to parse bundle database")?
-    };
-
-    match op {
-        "list-files" => {
-            let index = database.files();
-
-            if let Some(bundle) = sub_matches.get_one::<String>("bundle") {
-                let hash = u64::from_str_radix(bundle, 16)
-                    .map(Murmur64::from)
-                    .wrap_err("Invalid hex sequence")?;
-
-                if let Some(files) = index.get(&hash) {
-                    for file in files {
-                        let name = ctx.lookup_hash(file.name, HashGroup::Filename);
-                        let extension = file.extension.ext_name();
-                        println!("{}.{}", name.display(), extension);
-                    }
-                } else {
-                    tracing::info!("Bundle {} not found in the database", bundle);
-                }
-            } else {
-                for (bundle_hash, files) in index.iter() {
-                    let bundle_name = ctx.lookup_hash(*bundle_hash, HashGroup::Filename);
-                    match bundle_name {
-                        IdString64::String(name) => {
-                            println!("{bundle_hash:016x} {name}");
-                        }
-                        IdString64::Hash(hash) => {
-                            println!("{hash:016x}");
-                        }
-                    }
-
-                    for file in files {
-                        let name = ctx.lookup_hash(file.name, HashGroup::Filename);
-                        let extension = file.extension.ext_name();
-                        match name {
-                            IdString64::String(name) => {
-                                println!("\t{:016x}.{:<12} {}", file.name, extension, name);
-                            }
-                            IdString64::Hash(hash) => {
-                                println!("\t{hash:016x}.{extension}");
-                            }
-                        }
-                    }
-
-                    println!();
-                }
-            }
-
-            Ok(())
-        }
-        "list-bundles" => {
-            for bundle_hash in database.bundles().keys() {
-                let bundle_name = ctx.lookup_hash(*bundle_hash, HashGroup::Filename);
-                match bundle_name {
-                    IdString64::String(name) => {
-                        println!("{bundle_hash:016x} {name}");
-                    }
-                    IdString64::Hash(hash) => {
-                        println!("{hash:016x}");
-                    }
-                }
-            }
-
-            Ok(())
-        }
-        "find-file" => {
-            let name = sub_matches
-                .get_one::<String>("file-name")
-                .expect("required argument");
-            let name = match u64::from_str_radix(name, 16).map(Murmur64::from) {
-                Ok(hash) => hash,
-                Err(_) => Murmur64::hash(name),
-            };
-
-            let bundles = database.files().iter().filter_map(|(bundle_hash, files)| {
-                if files.iter().any(|file| file.name == name) {
-                    Some(bundle_hash)
-                } else {
-                    None
-                }
-            });
-
-            let mut found = false;
-
-            for bundle in bundles {
-                found = true;
-                println!("{bundle:016x}");
-            }
-
-            if !found {
-                std::process::exit(1);
-            }
-
-            Ok(())
-        }
-        _ => unreachable!(
-            "clap is configured to require a subcommand, and they're all handled above"
-        ),
-    }
-}

View file

@@ -150,7 +150,7 @@ async fn parse_command_line_template(tmpl: &String) -> Result<CmdLine> {
         String::from_utf8_unchecked(bytes.to_vec())
     });

-    for arg in parsed.by_ref() {
+    while let Some(arg) = parsed.next() {
         // Safety: See above.
         cmd.arg(unsafe { String::from_utf8_unchecked(arg.to_vec()) });
     }
@@ -287,34 +287,6 @@ where
     P1: AsRef<Path> + std::fmt::Debug,
     P2: AsRef<Path> + std::fmt::Debug,
 {
-    let ctx = if ctx.game_dir.is_some() {
-        tracing::debug!(
-            "Got game directory from config: {}",
-            ctx.game_dir.as_ref().unwrap().display()
-        );
-
-        ctx
-    } else {
-        let game_dir = path
-            .as_ref()
-            .parent()
-            .and_then(|parent| parent.parent())
-            .map(|p| p.to_path_buf());
-
-        tracing::info!(
-            "No game directory configured, guessing from bundle path: {:?}",
-            game_dir
-        );
-
-        Arc::new(sdk::Context {
-            game_dir,
-            lookup: Arc::clone(&ctx.lookup),
-            ljd: ctx.ljd.clone(),
-            revorb: ctx.revorb.clone(),
-            ww2ogg: ctx.ww2ogg.clone(),
-        })
-    };
-
     let bundle = {
         let data = fs::read(path.as_ref()).await?;
         let name = Bundle::get_name_from_path(&ctx, path.as_ref());
@@ -473,7 +445,7 @@ where
             }
         }
         Err(err) => {
-            let err = err.wrap_err(format!("Failed to decompile file {name}"));
+            let err = err.wrap_err(format!("Failed to decompile file {}", name));
             tracing::error!("{:?}", err);
         }
     };

View file

@@ -1,298 +1,112 @@
-use std::path::{Path, PathBuf};
-use std::str::FromStr as _;
+use std::path::PathBuf;

-use clap::{value_parser, Arg, ArgAction, ArgMatches, Command};
-use color_eyre::eyre::{self, Context, OptionExt, Result};
+use clap::{value_parser, Arg, ArgMatches, Command};
+use color_eyre::eyre::{self, Context, Result};
 use color_eyre::Help;
-use path_slash::PathBufExt as _;
-use sdk::murmur::IdString64;
-use sdk::{Bundle, BundleFile, BundleFileType};
-use tokio::fs;
+use sdk::Bundle;
+use tokio::fs::{self, File};
+use tokio::io::AsyncReadExt;

 pub(crate) fn command_definition() -> Command {
     Command::new("inject")
-        .subcommand_required(true)
-        .about("Inject a file into a bundle.\n\
-            Raw binary data can be used to directly replace the file's variant data blob without affecting the metadata.\n\
-            Alternatively, a compiler format may be specified, and a complete bundle file is created.")
+        .about("Inject a file into a bundle.")
+        .arg(
+            Arg::new("replace")
+                .help("The name of a file in the bundle whos content should be replaced.")
+                .short('r')
+                .long("replace"),
+        )
         .arg(
             Arg::new("output")
                 .help(
                     "The path to write the changed bundle to. \
-                    If omitted, the input bundle will be overwritten.\n\
-                    Remember to add a `.patch_<NUMBER>` suffix if you also use '--patch'.",
+                    If omitted, the input bundle will be overwritten.",
                 )
                 .short('o')
                 .long("output")
                 .value_parser(value_parser!(PathBuf)),
         )
         .arg(
-            Arg::new("patch")
-                .help("Create a patch bundle. Optionally, a patch NUMBER may be specified as \
-                    '--patch=123'.\nThe maximum number is 999, the default is 1.\n\
-                    If `--output` is not specified, the `.patch_<NUMBER>` suffix is added to \
-                    the given bundle name.")
-                .short('p')
-                .long("patch")
-                .num_args(0..=1)
-                .require_equals(true)
-                .default_missing_value("1")
-                .value_name("NUMBER")
-                .value_parser(value_parser!(u16))
+            Arg::new("bundle")
+                .help("Path to the bundle to inject the file into.")
+                .required(true)
+                .value_parser(value_parser!(PathBuf)),
         )
         .arg(
-            Arg::new("type")
-                .help("Compile the new file as the given TYPE. If omitted, the file type is \
-                    is guessed from the file extension.")
-                .value_name("TYPE")
+            Arg::new("file")
+                .help("Path to the file to inject.")
+                .required(true)
+                .value_parser(value_parser!(PathBuf)),
         )
-        .subcommand(
-            Command::new("replace")
-                .about("Replace an existing file in the bundle")
-                .arg(
-                    Arg::new("variant")
-                        .help("In combination with '--raw', specify the variant index to replace.")
-                        .long("variant")
-                        .default_value("0")
-                        .value_parser(value_parser!(u8))
-                )
-                .arg(
-                    Arg::new("raw")
-                        .help("Insert the given file as raw binary data.\n\
-                            Cannot be used with '--patch'.")
-                        .long("raw")
-                        .action(ArgAction::SetTrue)
-                )
-                .arg(
-                    Arg::new("bundle")
-                        .help("Path to the bundle to inject the file into.")
-                        .required(true)
-                        .value_parser(value_parser!(PathBuf)),
-                )
-                .arg(
-                    Arg::new("bundle-file")
-                        .help("The name of a file in the bundle whose content should be replaced.")
-                        .required(true),
-                )
-                .arg(
-                    Arg::new("new-file")
-                        .help("Path to the file to inject.")
-                        .required(true)
-                        .value_parser(value_parser!(PathBuf)),
-                ),
-        )
-        // .subcommand(
-        //     Command::new("add")
-        //         .about("Add a new file to the bundle")
-        //         .arg(
-        //             Arg::new("new-file")
-        //                 .help("Path to the file to inject.")
-        //                 .required(true)
-        //                 .value_parser(value_parser!(PathBuf)),
-        //         )
-        //         .arg(
-        //             Arg::new("bundle")
-        //                 .help("Path to the bundle to inject the file into.")
-        //                 .required(true)
-        //                 .value_parser(value_parser!(PathBuf)),
-        //         ),
-        // )
 }

-#[tracing::instrument]
-async fn compile_file(
-    path: impl AsRef<Path> + std::fmt::Debug,
-    name: impl Into<IdString64> + std::fmt::Debug,
-    file_type: BundleFileType,
-) -> Result<BundleFile> {
-    let path = path.as_ref();
-
-    let file_data = fs::read(&path)
-        .await
-        .wrap_err_with(|| format!("Failed to read file '{}'", path.display()))?;
-
-    let _sjson = String::from_utf8(file_data)
-        .wrap_err_with(|| format!("Invalid UTF8 data in '{}'", path.display()))?;
-
-    let _root = path.parent().ok_or_eyre("File path has no parent")?;
-
-    eyre::bail!(
-        "Compilation for type '{}' is not implemented, yet",
-        file_type
-    )
-}
-
-#[tracing::instrument(
-    skip_all,
-    fields(
-        bundle_path = tracing::field::Empty,
-        in_file_path = tracing::field::Empty,
-        output_path = tracing::field::Empty,
-        target_name = tracing::field::Empty,
-        file_type = tracing::field::Empty,
-        raw = tracing::field::Empty,
-    )
-)]
+#[tracing::instrument(skip_all)]
 pub(crate) async fn run(ctx: sdk::Context, matches: &ArgMatches) -> Result<()> {
-    let Some((op, sub_matches)) = matches.subcommand() else {
-        unreachable!("clap is configured to require a subcommand, and they're all handled above");
-    };
-
-    let bundle_path = sub_matches
+    let bundle_path = matches
         .get_one::<PathBuf>("bundle")
         .expect("required parameter not found");

-    let in_file_path = sub_matches
-        .get_one::<PathBuf>("new-file")
+    let file_path = matches
+        .get_one::<PathBuf>("file")
         .expect("required parameter not found");

-    let patch_number = matches
-        .get_one::<u16>("patch")
-        .map(|num| format!("{num:03}"));
-
-    let output_path = matches
-        .get_one::<PathBuf>("output")
-        .cloned()
-        .unwrap_or_else(|| {
-            let mut output_path = bundle_path.clone();
-
-            if let Some(patch_number) = patch_number.as_ref() {
-                output_path.set_extension(format!("patch_{patch_number:03}"));
-            }
-
-            output_path
-        });
-
-    let target_name = if op == "replace" {
-        sub_matches
-            .get_one::<String>("bundle-file")
-            .map(|name| match u64::from_str_radix(name, 16) {
-                Ok(id) => IdString64::from(id),
-                Err(_) => IdString64::String(name.clone()),
-            })
-            .expect("argument is required")
-    } else {
-        let mut path = PathBuf::from(in_file_path);
-        path.set_extension("");
-        IdString64::from(path.to_slash_lossy().to_string())
-    };
-
-    let file_type = if let Some(forced_type) = matches.get_one::<String>("type") {
-        BundleFileType::from_str(forced_type.as_str()).wrap_err("Unknown file type")?
-    } else {
-        in_file_path
-            .extension()
-            .and_then(|s| s.to_str())
-            .ok_or_eyre("File extension missing")
-            .and_then(BundleFileType::from_str)
-            .wrap_err("Unknown file type")
-            .with_suggestion(|| "Use '--type TYPE' to specify the file type")?
-    };
-
-    {
-        let span = tracing::Span::current();
-        if !span.is_disabled() {
-            span.record("bundle_path", bundle_path.display().to_string());
-            span.record("in_file_path", in_file_path.display().to_string());
-            span.record("output_path", output_path.display().to_string());
-            span.record("raw", sub_matches.get_flag("raw"));
-            span.record("target_name", target_name.display().to_string());
-            span.record("file_type", format!("{file_type:?}"));
-        }
-    }
-
-    let bundle_name = Bundle::get_name_from_path(&ctx, bundle_path);
+    tracing::trace!(bundle_path = %bundle_path.display(), file_path = %file_path.display());

     let mut bundle = {
-        fs::read(bundle_path)
-            .await
-            .map_err(From::from)
-            .and_then(|binary| Bundle::from_binary(&ctx, bundle_name.clone(), binary))
-            .wrap_err_with(|| format!("Failed to open bundle '{}'", bundle_path.display()))?
+        let binary = fs::read(bundle_path).await?;
+        let name = Bundle::get_name_from_path(&ctx, bundle_path);
+        Bundle::from_binary(&ctx, name, binary).wrap_err("Failed to open bundle file")?
     };

-    let output_bundle = match op {
-        "replace" => {
-            let Some(file) = bundle
-                .files_mut()
-                .find(|file| *file.base_name() == target_name)
-            else {
-                let err = eyre::eyre!(
-                    "No file with name '{}' in bundle '{}'",
-                    target_name.display(),
-                    bundle_path.display()
-                );
-
-                return Err(err).with_suggestion(|| {
-                    format!(
-                        "Run '{} bundle list \"{}\"' to list the files in this bundle.",
-                        clap::crate_name!(),
-                        bundle_path.display()
-                    )
-                });
-            };
-
-            if sub_matches.get_flag("raw") {
-                let variant_index = sub_matches
-                    .get_one::<u8>("variant")
-                    .expect("argument with default missing");
-
-                let Some(variant) = file.variants_mut().nth(*variant_index as usize) else {
-                    let err = eyre::eyre!(
-                        "Variant index '{}' does not exist in '{}'",
-                        variant_index,
-                        target_name.display()
-                    );
-
-                    return Err(err).with_suggestion(|| {
-                        format!(
-                            "See '{} bundle inject add --help' if you want to add it as a new file",
-                            clap::crate_name!(),
-                        )
-                    });
-                };
-
-                let data = tokio::fs::read(&in_file_path).await.wrap_err_with(|| {
-                    format!("Failed to read file '{}'", in_file_path.display())
-                })?;
-                variant.set_data(data);
-                file.set_modded(true);
-                bundle
-            } else {
-                let mut bundle_file = compile_file(in_file_path, target_name.clone(), file_type)
-                    .await
-                    .wrap_err("Failed to compile")?;
-                bundle_file.set_modded(true);
-
-                if patch_number.is_some() {
-                    let mut output_bundle = Bundle::new(bundle_name);
-                    output_bundle.add_file(bundle_file);
-                    output_bundle
-                } else {
-                    *file = bundle_file;
-                    dbg!(&file);
-                    bundle
-                }
-            }
-        }
-        "add" => {
-            unimplemented!("Implement adding a new file to the bundle.");
-        }
-        "copy" => {
-            unimplemented!("Implement copying a file from one bundle to the other.");
-        }
-        _ => unreachable!("no other operations exist"),
-    };
-
-    let data = output_bundle
-        .to_binary()
-        .wrap_err("Failed to write changed bundle to output")?;
-
-    fs::write(&output_path, &data)
-        .await
-        .wrap_err_with(|| format!("Failed to write data to '{}'", output_path.display()))?;
-
-    tracing::info!("Modified bundle written to '{}'", output_path.display());
-
-    Ok(())
+    if let Some(name) = matches.get_one::<String>("replace") {
+        let mut file = File::open(&file_path)
+            .await
+            .wrap_err_with(|| format!("Failed to open '{}'", file_path.display()))?;
+
+        if let Some(variant) = bundle
+            .files_mut()
+            .filter(|file| file.matches_name(name.clone()))
+            // TODO: Handle file variants
+            .find_map(|file| file.variants_mut().next())
+        {
+            let mut data = Vec::new();
+            file.read_to_end(&mut data)
+                .await
+                .wrap_err("Failed to read input file")?;
+            variant.set_data(data);
+        } else {
+            let err = eyre::eyre!("No file '{}' in this bundle.", name)
+                .with_suggestion(|| {
+                    format!(
+                        "Run '{} bundle list {}' to list the files in this bundle.",
+                        clap::crate_name!(),
+                        bundle_path.display()
+                    )
+                })
+                .with_suggestion(|| {
+                    format!(
+                        "Use '{} bundle inject --add {} {} {}' to add it as a new file",
+                        clap::crate_name!(),
+                        name,
+                        bundle_path.display(),
+                        file_path.display()
+                    )
+                });
+
+            return Err(err);
+        }
+
+        let out_path = matches.get_one::<PathBuf>("output").unwrap_or(bundle_path);
+        let data = bundle
+            .to_binary()
+            .wrap_err("Failed to write changed bundle to output")?;
+
+        fs::write(out_path, &data)
+            .await
+            .wrap_err("Failed to write data to output file")?;
+
+        Ok(())
+    } else {
+        eyre::bail!("Currently, only the '--replace' operation is supported.");
+    }
 }
@ -36,18 +36,6 @@ enum OutputFormat {
    Text,
}
-fn format_byte_size(size: usize) -> String {
-    if size < 1024 {
-        format!("{size} Bytes")
-    } else if size < 1024 * 1024 {
-        format!("{} kB", size / 1024)
-    } else if size < 1024 * 1024 * 1024 {
-        format!("{} MB", size / (1024 * 1024))
-    } else {
-        format!("{} GB", size / (1024 * 1024 * 1024))
-    }
-}
#[tracing::instrument(skip(ctx))]
async fn print_bundle_contents<P>(ctx: &sdk::Context, path: P, fmt: OutputFormat) -> Result<()>
where
@ -62,11 +50,7 @@ where
    match fmt {
        OutputFormat::Text => {
-            println!(
-                "Bundle: {} ({:016x})",
-                bundle.name().display(),
-                bundle.name()
-            );
+            println!("Bundle: {}", bundle.name().display());
            for f in bundle.files().iter() {
                if f.variants().len() != 1 {
@ -79,10 +63,9 @@ where
            let v = &f.variants()[0];
            println!(
-                "\t{}.{}: {} ({})",
+                "\t{}.{}: {} bytes",
                f.base_name().display(),
                f.file_type().ext_name(),
-                format_byte_size(v.size()),
                v.size()
            );
        }
@ -1,7 +1,6 @@
use clap::{ArgMatches, Command};
use color_eyre::eyre::Result;
-mod db;
mod decompress;
mod extract;
mod inject;
@ -15,7 +14,6 @@ pub(crate) fn command_definition() -> Command {
        .subcommand(extract::command_definition())
        .subcommand(inject::command_definition())
        .subcommand(list::command_definition())
-        .subcommand(db::command_definition())
}
#[tracing::instrument(skip_all)]
@ -25,7 +23,6 @@ pub(crate) async fn run(ctx: sdk::Context, matches: &ArgMatches) -> Result<()> {
        Some(("extract", sub_matches)) => extract::run(ctx, sub_matches).await,
        Some(("inject", sub_matches)) => inject::run(ctx, sub_matches).await,
        Some(("list", sub_matches)) => list::run(ctx, sub_matches).await,
-        Some(("db", sub_matches)) => db::run(ctx, sub_matches).await,
        _ => unreachable!(
            "clap is configured to require a subcommand, and they're all handled above"
        ),
@ -1,5 +1,4 @@
use std::path::PathBuf;
-use std::sync::Arc;
use clap::{value_parser, Arg, ArgAction, ArgMatches, Command, ValueEnum};
use cli_table::{print_stdout, WithTitle};
@ -157,8 +156,6 @@ pub(crate) async fn run(mut ctx: sdk::Context, matches: &ArgMatches) -> Result<(
        BufReader::new(Box::new(f))
    };
-    let lookup = Arc::make_mut(&mut ctx.lookup);
    let group = sdk::murmur::HashGroup::from(*group);
    let mut added = 0;
@ -168,15 +165,15 @@ pub(crate) async fn run(mut ctx: sdk::Context, matches: &ArgMatches) -> Result<(
    let total = {
        for line in lines.into_iter() {
            let value = line?;
-            if lookup.find(&value, group).is_some() {
+            if ctx.lookup.find(&value, group).is_some() {
                skipped += 1;
            } else {
-                lookup.add(value, group);
+                ctx.lookup.add(value, group);
                added += 1;
            }
        }
-        lookup.len()
+        ctx.lookup.len()
    };
    let out_path = matches
@ -193,7 +190,7 @@ pub(crate) async fn run(mut ctx: sdk::Context, matches: &ArgMatches) -> Result<(
        })
        .with_section(|| out_path.display().to_string().header("Path:"))?;
-    lookup
+    ctx.lookup
        .to_csv(f)
        .await
        .wrap_err("Failed to write dictionary to disk")?;
@ -230,12 +227,9 @@ pub(crate) async fn run(mut ctx: sdk::Context, matches: &ArgMatches) -> Result<(
            let lookup = &ctx.lookup;
            let rows: Vec<_> = lookup.entries().iter().map(TableRow::from).collect();
-            match print_stdout(rows.with_title()) {
-                Ok(_) => Ok(()),
-                // Closing stdout prematurely is normal behavior with things like piping into `head`
-                Err(err) if err.kind() == std::io::ErrorKind::BrokenPipe => Ok(()),
-                Err(err) => Err(err.into()),
-            }
+            print_stdout(rows.with_title())?;
+            Ok(())
        }
        _ => unreachable!(
            "clap is configured to require a subcommand, and they're all handled above"
@ -0,0 +1,520 @@
use std::collections::HashSet;
use std::fs;
use std::io::{BufWriter, Write};
use std::path::PathBuf;
use std::sync::Arc;
use std::thread::JoinHandle;
use clap::{value_parser, Arg, ArgAction, ArgMatches, Command};
use color_eyre::eyre::{self, Context};
use color_eyre::Result;
use crossbeam::channel::{bounded, unbounded, Receiver, Sender};
use itertools::Itertools;
use sdk::murmur::Murmur64;
use tokio::time::Instant;
pub(crate) fn command_definition() -> Command {
Command::new("brute-force-words")
.about(
"Given a list of words and a set of delimiters, iteratevily creates permutations \
of growing length.\n\
Delimiters are placed between every word in the result.\n\n\
Example: \
Given the words ['packages', 'boot'], the delimiters ['/', '_'] and a length of 2, the resulting \
words will be\n\
- packages\n\
- boot\n\
- packages/packages\n\
- packages_packages\n\
- packages/boot\n\
- packages_boot\n\
- boot/packages\n\
- boot_packages\n\
- boot/boot\n\
- boot_boot",
)
.arg(
Arg::new("delimiter")
.help(
"The delimiters to put between the words. \
All permutations of this list will be tried for every string of words.\n\
Specify multiple times to set multiple values.\n\
Defaults to ['/', '_'].",
)
.short('d')
.long("delimiter")
.action(ArgAction::Append),
)
.arg(
Arg::new("max-length")
.help("The maximum number of words up to which to build strings.")
.long("max")
.long("max-length")
.short('m')
.default_value("5")
.value_parser(value_parser!(usize)),
)
.arg(
Arg::new("continue")
.help("Can be used to continue a previous operation where it stopped. Word list and delimiters must match.")
.short('c')
.long("continue")
)
.arg(
Arg::new("threads")
.help("The number of workers to run in parallel.")
.long("threads")
.short('n')
.default_value("6")
.value_parser(value_parser!(usize))
)
.arg(
Arg::new("words")
.help("Path to a file containing words line by line.")
.required(true)
.value_parser(value_parser!(PathBuf)),
)
.arg(
Arg::new("hashes")
.help(
"Path to a file containing the hashes to attempt to brute force. \
Hashes are expected in hexadecimal notation. \
Only 64-bit hashes are supported."
)
.required(true)
.value_parser(value_parser!(PathBuf)),
)
}
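As a sanity check of the expansion described in the help text above, the length-2 case can be reproduced in a few lines. This is a minimal standalone sketch (it uses the same `itertools` crate as the implementation below, but is not part of the tool):

    use itertools::Itertools;

    fn main() {
        let words = ["packages", "boot"];
        let delims = ["/", "_"];
        // Length 1: the words themselves.
        for w in &words {
            println!("{w}");
        }
        // Length 2: every ordered word pair, with every delimiter in between.
        for (a, b) in words.iter().cartesian_product(words.iter()) {
            for d in &delims {
                println!("{a}{d}{b}");
            }
        }
    }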
const LINE_FEED: u8 = 0x0A;
const UNDERSCORE: u8 = 0x5F;
const ZERO: u8 = 0x30;
const PREFIXES: [&str; 36] = [
"",
"content/characters/",
"content/debug/",
"content/decals/",
"content/environment/",
"content/fx/",
"content/fx/particles/",
"content/gizmos/",
"content/items/",
"content/levels/",
"content/liquid_area/",
"content/localization/",
"content/materials/",
"content/minion_impact_assets/",
"content/pickups/",
"content/shading_environments/",
"content/textures/",
"content/ui/",
"content/videos/",
"content/vo/",
"content/volume_types/",
"content/weapons/",
"content/",
"core/",
"core/units/",
"packages/boot_assets/",
"packages/content/",
"packages/game_scripts/",
"packages/strings/",
"packages/ui/",
"packages/",
"wwise/events/",
"wwise/packages/",
"wwise/world_sound_fx/",
"wwise/events/weapons/",
"wwise/events/minions/",
];
fn make_info_printer(rx: Receiver<(usize, usize, String)>, hash_count: usize) -> JoinHandle<()> {
std::thread::spawn(move || {
let mut writer = std::io::stderr();
let mut total_count = 0;
let mut total_found = 0;
let mut start = Instant::now();
while let Ok((count, found, last)) = rx.recv() {
total_count += count;
total_found += found;
let now = Instant::now();
if (now - start).as_millis() > 250 {
let s = &last[0..std::cmp::min(last.len(), 60)];
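// `start` is reset roughly every 250 ms, so the count accumulated in one
// window is multiplied by 4 to approximate a per-second rate.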
let s = format!(
"\r{:12} per second | {total_found:6}/{hash_count} found | {s:<60}",
total_count * 4
);
writer.write_all(s.as_bytes()).unwrap();
total_count = 0;
start = now;
}
}
})
}
fn make_stdout_printer(rx: Receiver<Vec<u8>>) -> JoinHandle<()> {
std::thread::spawn(move || {
let mut writer = std::io::stdout();
while let Ok(buf) = rx.recv() {
writer.write_all(&buf).unwrap();
}
})
}
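The two printer threads exist so that workers never block on a locked or slow output stream: everything funnels through a channel into a single owning writer. A minimal standalone sketch of that pattern (not the code above):

    use crossbeam::channel::unbounded;
    use std::io::Write;

    fn main() {
        let (tx, rx) = unbounded::<Vec<u8>>();
        // One dedicated thread owns stdout; workers only send byte buffers.
        let printer = std::thread::spawn(move || {
            let mut writer = std::io::BufWriter::new(std::io::stdout().lock());
            while let Ok(buf) = rx.recv() {
                writer.write_all(&buf).unwrap();
            }
            writer.flush().unwrap();
        });
        for i in 0..4 {
            let tx = tx.clone();
            std::thread::spawn(move || {
                tx.send(format!("worker {i}\n").into_bytes()).unwrap();
            });
        }
        // Dropping the last sender on this thread lets the printer loop end
        // once the worker threads have dropped their clones, too.
        drop(tx);
        printer.join().unwrap();
    }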
struct State {
delimiter_lists: Arc<Vec<Vec<String>>>,
hashes: Arc<HashSet<Murmur64>>,
words: Arc<Vec<String>>,
delimiters_len: usize,
stdout_tx: Sender<Vec<u8>>,
info_tx: Sender<(usize, usize, String)>,
}
fn make_worker(rx: Receiver<Vec<usize>>, state: State) -> JoinHandle<()> {
std::thread::spawn(move || {
let delimiter_lists = &state.delimiter_lists;
let hashes = &state.hashes;
let words = &state.words;
let delimiters_len = state.delimiters_len;
let mut count = 0;
let mut found = 0;
let mut buf = Vec::with_capacity(1024);
while let Ok(indices) = rx.recv() {
let sequence = indices.iter().map(|i| words[*i].as_str());
// We only want delimiters between words, so we keep that iterator shorter by
// one.
let delimiter_count = sequence.len() as u32 - 1;
for prefix in PREFIXES.iter().map(|p| p.as_bytes()) {
buf.clear();
// We can keep the prefix at the front of the buffer and only
// replace the parts after that.
let prefix_len = prefix.len();
buf.extend_from_slice(prefix);
for delims in delimiter_lists
.iter()
.take(delimiters_len.pow(delimiter_count))
{
buf.truncate(prefix_len);
let delims = delims
.iter()
.map(|s| s.as_str())
.take(delimiter_count as usize);
sequence
.clone()
.interleave(delims.clone())
.for_each(|word| buf.extend_from_slice(word.as_bytes()));
count += 1;
let hash = Murmur64::hash(&buf);
if hashes.contains(&hash) {
found += 1;
buf.push(LINE_FEED);
if state.stdout_tx.send(buf.clone()).is_err() {
return;
}
} else {
let word_len = buf.len();
// If the regular word itself didn't match, we check
// for numbered suffixes.
// For now, we only check up to `09` to avoid more complex logic
// writing into the buffer.
// Packages that contain files with higher numbers than this
// should hopefully become easier to spot once a good number of
// hashes is found.
for i in 1..=9 {
buf.truncate(word_len);
buf.push(UNDERSCORE);
buf.push(ZERO);
buf.push(ZERO + i);
count += 1;
let hash = Murmur64::hash(&buf);
if hashes.contains(&hash) {
found += 1;
buf.push(LINE_FEED);
if state.stdout_tx.send(buf.clone()).is_err() {
return;
}
} else {
break;
}
}
}
}
}
if count >= 2 * 1024 * 1024 {
// The last prefix in the set is the one that will stay in the buffer
// when we're about to print here.
// So we strip that, to show just the generated part.
// We also restrict the length to stay on a single line.
let prefix_len = PREFIXES[35].len();
// No need to wait for this
let _ = state.info_tx.try_send((
count,
found,
String::from_utf8_lossy(&buf[prefix_len..]).to_string(),
));
count = 0;
found = 0;
}
}
})
}
fn build_delimiter_lists(delimiters: impl AsRef<[String]>, max_length: usize) -> Vec<Vec<String>> {
let delimiters = delimiters.as_ref();
let mut indices = vec![0; max_length];
let mut list = Vec::new();
for _ in 0..delimiters.len().pow(max_length as u32) {
list.push(
indices
.iter()
.map(|i| delimiters[*i].clone())
.collect::<Vec<_>>(),
);
for v in indices.iter_mut() {
if *v >= delimiters.len() - 1 {
*v = 0;
break;
} else {
*v += 1;
}
}
}
list
}
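The update loop here is not a textbook odometer: every position below the first "maxed" one is incremented on each pass, and only the maxed position resets and breaks. It still visits every combination exactly once, just in an unusual order. A quick standalone check, assuming `build_delimiter_lists` is in scope:

    fn main() {
        let delims = [String::from("/"), String::from("_")];
        for combo in build_delimiter_lists(&delims, 2) {
            println!("{:?}", combo);
        }
        // Prints ["/", "/"], ["_", "_"], ["/", "_"], ["_", "/"]:
        // all four pairs, each exactly once.
    }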
fn build_initial_indices(
cont: Option<&String>,
delimiters: impl AsRef<[String]>,
words: impl AsRef<[String]>,
) -> Result<Vec<usize>> {
if let Some(cont) = cont {
let mut splits = vec![cont.clone()];
for delim in delimiters.as_ref().iter() {
splits = splits
.iter()
.flat_map(|s| s.split(delim))
.map(|s| s.to_string())
.collect();
}
let indices = splits
.into_iter()
.map(|s| {
words
.as_ref()
.iter()
.enumerate()
.find(|(_, v)| s == **v)
.map(|(i, _)| i)
.ok_or_else(|| eyre::eyre!("'{}' is not in the word list", s))
})
.collect::<Result<_>>()?;
tracing::info!("Continuing from '{}' -> '{:?}'", cont, &indices);
Ok(indices)
} else {
Ok(vec![0])
}
}
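For example, with the word list `["packages", "boot"]` and the default delimiters, `--continue packages/boot` resolves back to the index vector `[0, 1]`. A minimal check, assuming the function above is in scope:

    fn main() -> color_eyre::Result<()> {
        let delimiters = [String::from("/"), String::from("_")];
        let words = [String::from("packages"), String::from("boot")];
        let cont = String::from("packages/boot");
        let indices = build_initial_indices(Some(&cont), &delimiters, &words)?;
        assert_eq!(indices, vec![0, 1]);
        Ok(())
    }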
#[tracing::instrument(skip_all)]
#[allow(clippy::mut_range_bound)]
pub(crate) fn run(_ctx: sdk::Context, matches: &ArgMatches) -> Result<()> {
let max_length: usize = matches
.get_one::<usize>("max-length")
.copied()
.expect("parameter has default");
let num_threads: usize = matches
.get_one::<usize>("threads")
.copied()
.expect("parameter has default");
let words = {
let path = matches
.get_one::<PathBuf>("words")
.expect("missing required parameter");
let file = fs::read_to_string(path)
.wrap_err_with(|| format!("Failed to read file '{}'", path.display()))?;
let words: Vec<_> = file.lines().map(str::to_string).collect();
if words.is_empty() {
eyre::bail!("Word list must not be empty");
}
Arc::new(words)
};
let hashes = {
let path = matches
.get_one::<PathBuf>("hashes")
.expect("missing required argument");
let content = fs::read_to_string(path)
.wrap_err_with(|| format!("Failed to read file '{}'", path.display()))?;
let hashes: Result<HashSet<_>, _> = content
.lines()
.map(|s| u64::from_str_radix(s, 16).map(Murmur64::from))
.collect();
let hashes = hashes?;
tracing::trace!("{:?}", hashes);
Arc::new(hashes)
};
let mut delimiters: Vec<String> = matches
.get_many::<String>("delimiter")
.unwrap_or_default()
.cloned()
.collect();
if delimiters.is_empty() {
delimiters.push(String::from("/"));
delimiters.push(String::from("_"));
}
let delimiters_len = delimiters.len();
let word_count = words.len();
tracing::info!("{} words to try", word_count);
// To be able to easily combine the permutations of words and delimiters,
// we turn the latter into a pre-defined list of all permutations of delimiters
// that are possible at the given amount of words.
// Combining `Iterator::cycle` with `Itertools::permutations` works, but
// with a high `max_length`, it runs OOM.
// So we basically have to implement a smaller version of the iterative algorithm we use later on
// to build permutations of the actual words.
let delimiter_lists = {
let lists = build_delimiter_lists(&delimiters, max_length - 1);
Arc::new(lists)
};
tracing::debug!("{:?}", delimiter_lists);
let (info_tx, info_rx) = bounded(100);
let (stdout_tx, stdout_rx) = unbounded::<Vec<u8>>();
let (task_tx, task_rx) = bounded::<Vec<usize>>(num_threads * 4);
let mut handles = Vec::new();
for _ in 0..num_threads {
let handle = make_worker(
task_rx.clone(),
State {
delimiter_lists: Arc::clone(&delimiter_lists),
hashes: Arc::clone(&hashes),
words: Arc::clone(&words),
delimiters_len,
stdout_tx: stdout_tx.clone(),
info_tx: info_tx.clone(),
},
);
handles.push(handle);
}
// These are only used inside the worker threads, but due to the loops above, we had to
// clone them one too many times.
// So we drop that extra reference immediately, to ensure that the channels can
// disconnect properly when the threads finish.
drop(stdout_tx);
drop(info_tx);
handles.push(make_info_printer(info_rx, hashes.len()));
handles.push(make_stdout_printer(stdout_rx));
let mut indices =
build_initial_indices(matches.get_one::<String>("continue"), &delimiters, &*words)
.wrap_err("Failed to build initial indices")?;
let mut indices_len = indices.len();
let mut sequence = indices
.iter()
.map(|index| words[*index].as_str())
.collect::<Vec<_>>();
// Prevent re-allocation by reserving as much as we need upfront
indices.reserve(max_length);
sequence.reserve(max_length);
'outer: loop {
task_tx.send(indices.clone())?;
for i in 0..indices_len {
let index = indices.get_mut(i).unwrap();
let word = sequence.get_mut(i).unwrap();
if *index >= word_count - 1 {
*index = 0;
*word = words[*index].as_str();
if indices.get(i + 1).is_none() {
indices_len += 1;
if indices_len > max_length {
break 'outer;
}
indices.push(0);
sequence.push(words[0].as_str());
break;
}
} else {
*index += 1;
*word = words[*index].as_str();
break;
}
}
}
// Dropping the senders will disconnect the channel,
// so that the threads holding the other end will eventually
// complete as well.
drop(task_tx);
for handle in handles {
match handle.join() {
Ok(_) => {}
Err(value) => {
if let Some(err) = value.downcast_ref::<String>() {
eyre::bail!("Thread failed: {}", err);
} else {
eyre::bail!("Thread failed with unknown error: {:?}", value);
}
}
}
}
let _ = std::io::stdout().write_all("\r".as_bytes());
Ok(())
}
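The `'outer` loop above is the same kind of counter, but over word indices, and it grows by one position whenever the existing positions all wrap around. A standalone sketch of just that enumeration (hypothetical, with printing in place of hashing):

    fn main() {
        let words = ["packages", "boot"];
        let max_length = 2;
        let mut indices = vec![0usize];
        'outer: loop {
            let candidate = indices
                .iter()
                .map(|&i| words[i])
                .collect::<Vec<_>>()
                .join("-");
            println!("{candidate}");
            for i in 0..indices.len() {
                if indices[i] >= words.len() - 1 {
                    // This position wrapped; grow if it was the last one.
                    indices[i] = 0;
                    if indices.get(i + 1).is_none() {
                        indices.push(0);
                        if indices.len() > max_length {
                            break 'outer;
                        }
                        break;
                    }
                } else {
                    indices[i] += 1;
                    break;
                }
            }
        }
        // Prints: packages, boot, packages-packages, boot-packages,
        // packages-boot, boot-boot.
    }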
@ -0,0 +1,463 @@
use std::collections::HashMap;
use std::path::PathBuf;
use clap::{value_parser, Arg, ArgMatches, Command, ValueEnum};
use color_eyre::eyre::Context;
use color_eyre::Result;
use tokio::fs;
pub(crate) fn command_definition() -> Command {
Command::new("extract-words")
.about(
"Extract unique alphanumeric sequences that match common identifier rules from the given file. \
Only ASCII is supported.",
)
.arg(
Arg::new("file")
.required(true)
.value_parser(value_parser!(PathBuf))
.help("Path to the file to extract words from."),
)
.arg(
Arg::new("min-length")
.help("Minimum length to consider a word.")
.long("min-length")
.short('m')
.default_value("3")
.value_parser(value_parser!(usize))
)
.arg(
Arg::new("algorithm")
.help("The algorithm to determine matching words")
.long("algorithm")
.short('a')
.default_value("identifier")
.value_parser(value_parser!(Algorithm))
)
}
#[derive(Copy, Clone, Debug, Eq, PartialEq, ValueEnum)]
#[value(rename_all = "snake_case")]
enum Algorithm {
Alphabetic,
Alphanumeric,
Identifier,
Number,
Hash32,
Hash64,
Paths,
}
impl Algorithm {
fn is_start(&self, c: char) -> bool {
match self {
Self::Alphabetic => c.is_ascii_alphabetic(),
Self::Alphanumeric => c.is_ascii_alphanumeric(),
Self::Identifier => c.is_ascii_alphabetic(),
Self::Number => c.is_numeric(),
Self::Hash32 | Self::Hash64 => matches!(c, '0'..='9' | 'a'..='f' | 'A'..='F'),
// Supposed to be handled separately
Self::Paths => false,
}
}
fn is_body(&self, c: char) -> bool {
match self {
Self::Alphabetic => c.is_ascii_alphabetic(),
Self::Alphanumeric => c.is_ascii_alphanumeric(),
Self::Identifier => c.is_ascii_alphanumeric(),
Self::Number => c.is_numeric(),
Self::Hash32 | Self::Hash64 => matches!(c, '0'..='9' | 'a'..='f' | 'A'..='F'),
// Supposed to be handled separately
Self::Paths => false,
}
}
fn is_length(&self, len: usize) -> bool {
match self {
Self::Alphabetic => true,
Self::Alphanumeric => true,
Self::Identifier => true,
Self::Number => true,
Self::Hash32 => len == 8,
Self::Hash64 => len == 16,
// Supposed to be handled separately
Self::Paths => false,
}
}
}
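Taken together, the three predicates define a small token grammar: a word must begin with a character accepted by `is_start`, continue through characters accepted by `is_body`, and is only kept if `is_length` accepts the final length. For instance, `hash64` only ever accepts 16 hex digits:

    fn main() {
        let algo = Algorithm::Hash64;
        let token = "deadbeefdeadbeef"; // 16 hex characters
        assert!(algo.is_start(token.chars().next().unwrap()));
        assert!(token.chars().all(|c| algo.is_body(c)));
        assert!(algo.is_length(token.len()));
        assert!(!algo.is_length(15)); // anything but exactly 16 is rejected
    }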
impl std::fmt::Display for Algorithm {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(
f,
"{}",
match self {
Algorithm::Alphabetic => "alphabetic",
Algorithm::Alphanumeric => "alphanumeric",
Algorithm::Identifier => "identifier",
Algorithm::Number => "number",
Algorithm::Hash32 => "hash32",
Algorithm::Hash64 => "hash64",
Algorithm::Paths => "paths",
}
)
}
}
#[derive(Copy, Clone, Debug)]
enum PathState {
Begin,
PathComponent,
PathSeparator,
Boundary,
NonWord,
End,
}
#[tracing::instrument(skip(chars))]
fn extract_paths(chars: impl Iterator<Item = char>) -> Vec<Vec<String>> {
let mut chars = chars.peekable();
let mut state = PathState::Begin;
let mut list = Vec::new();
let mut path = Vec::new();
let mut word = String::new();
let is_boundary = |c: char| c == '\n' || c == ' ' || c == ',' || c == '\t' || c == '|';
'machine: loop {
state = match state {
PathState::Begin => match chars.next() {
None => PathState::End,
Some(c) if c.is_ascii_alphabetic() => {
word.push(c);
PathState::PathComponent
}
Some(c) if is_boundary(c) => PathState::Boundary,
Some('/') => PathState::PathSeparator,
Some(_) => PathState::NonWord,
},
PathState::PathComponent => match chars.next() {
None => {
path.push(word.clone());
list.push(path.clone());
PathState::End
}
Some(c) if c.is_ascii_alphanumeric() || c == '_' => {
word.push(c);
PathState::PathComponent
}
Some('/') => {
path.push(word.clone());
word.clear();
PathState::PathSeparator
}
Some(c) if is_boundary(c) => {
path.push(word.clone());
list.push(path.clone());
path.clear();
word.clear();
PathState::Boundary
}
Some(_) => {
list.push(path.clone());
path.clear();
word.clear();
PathState::NonWord
}
},
PathState::PathSeparator => match chars.next() {
None => {
list.push(path.clone());
PathState::End
}
Some('/') => PathState::PathSeparator,
Some(c) if c.is_ascii_alphabetic() || c == '_' => {
word.push(c);
PathState::PathComponent
}
Some(c) if is_boundary(c) => {
list.push(path.clone());
path.clear();
PathState::Boundary
}
Some(_) => {
list.push(path.clone());
path.clear();
PathState::NonWord
}
},
PathState::Boundary => match chars.next() {
None => PathState::End,
Some(c) if c.is_ascii_alphabetic() => {
word.push(c);
PathState::PathComponent
}
Some(c) if is_boundary(c) => PathState::Boundary,
Some(_) => PathState::NonWord,
},
PathState::NonWord => match chars.next() {
None => PathState::End,
Some(c) if is_boundary(c) => PathState::Boundary,
Some(_) => PathState::NonWord,
},
PathState::End => {
break 'machine;
}
}
}
list
}
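Given the states above, a bare word counts as a one-component path, and components accumulate across separators until a boundary character flushes them into the list. A small check, assuming `extract_paths` is in scope:

    fn main() {
        let input = "see content/fx/particles, core/units";
        let paths = extract_paths(input.chars());
        assert_eq!(paths.len(), 3);
        assert_eq!(paths[0], vec!["see"]);                        // bare word
        assert_eq!(paths[1], vec!["content", "fx", "particles"]); // flushed at ','
        assert_eq!(paths[2], vec!["core", "units"]);              // flushed at EOF
    }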
#[tracing::instrument(skip(chars))]
fn algorithm_path_components(chars: impl Iterator<Item = char>, min_length: usize) {
let mut chars = chars.peekable();
let mut state = PathState::Begin;
let mut word = String::new();
let mut lists = vec![HashMap::<String, usize>::new()];
let mut index = 0;
let is_boundary = |c: char| c == '\n' || c == ' ' || c == ',' || c == '\t';
'machine: loop {
state = match state {
PathState::Begin => match chars.next() {
None => PathState::End,
Some(c) if c.is_ascii_alphabetic() => {
word.push(c);
PathState::PathComponent
}
Some(c) if is_boundary(c) => PathState::Boundary,
// Ignore leading path separators to not trigger the logic of advancing
// the component count
Some('/') => PathState::Boundary,
Some(_) => PathState::NonWord,
},
PathState::PathComponent => match chars.next() {
None => PathState::End,
Some(c) if c.is_ascii_alphanumeric() || c == '_' => {
word.push(c);
PathState::PathComponent
}
Some('/') => PathState::PathSeparator,
Some(c) => {
if index > 0 && word.len() >= min_length {
let list = &mut lists[index];
list.entry(word.clone())
.and_modify(|count| *count += 1)
.or_insert(1);
}
word.clear();
index = 0;
if is_boundary(c) {
PathState::Boundary
} else {
PathState::NonWord
}
}
},
PathState::PathSeparator => {
if word.len() >= min_length {
let list = &mut lists[index];
list.entry(word.clone())
.and_modify(|count| *count += 1)
.or_insert(1);
}
word.clear();
index += 1;
if lists.get(index).is_none() {
lists.push(HashMap::new());
}
// Ignore multiple separators
while chars.next_if(|c| *c == '/').is_some() {}
match chars.next() {
None => PathState::End,
Some(c) if c.is_ascii_alphabetic() || c == '_' => {
word.push(c);
PathState::PathComponent
}
Some(c) if is_boundary(c) => {
index = 0;
PathState::Boundary
}
Some(_) => {
index = 0;
PathState::NonWord
}
}
}
PathState::Boundary => match chars.next() {
None => PathState::End,
Some(c) if c.is_ascii_alphabetic() => {
word.push(c);
PathState::PathComponent
}
Some(c) if is_boundary(c) => PathState::Boundary,
Some(_) => PathState::NonWord,
},
PathState::NonWord => match chars.next() {
None => PathState::End,
Some(c) if is_boundary(c) => PathState::Boundary,
Some(_) => PathState::NonWord,
},
PathState::End => {
if word.len() >= min_length {
let list = &mut lists[index];
list.entry(word.clone())
.and_modify(|count| *count += 1)
.or_insert(1);
}
break 'machine;
}
}
}
for i in 0..lists.len() {
print!("Word {i}, Count {i},");
}
println!();
let mut lines: Vec<Vec<Option<(String, usize)>>> = Vec::new();
for (i, list) in lists.into_iter().enumerate() {
let mut entries = list.into_iter().collect::<Vec<_>>();
entries.sort_by(|(_, a), (_, b)| b.partial_cmp(a).unwrap());
for (j, (word, count)) in entries.into_iter().enumerate() {
if let Some(line) = lines.get_mut(j) {
while line.len() < i {
line.push(None);
}
line.push(Some((word, count)));
} else {
let mut line = Vec::new();
while line.len() < i {
line.push(None);
}
line.push(Some((word, count)));
lines.push(line);
}
}
}
for line in lines.iter() {
for cell in line.iter() {
if let Some((word, count)) = cell {
print!("{},{},", word, count);
} else {
print!(",,");
}
}
println!();
}
}
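The output is a crude CSV matrix: one word/count column pair per path depth, with rows sorted by descending count within each depth. For an input like `content/fx_red/alpha content/ui_main/beta` (with the default minimum length of 3), the printout looks roughly like the following, though tie order within a column is unspecified:

    Word 0, Count 0,Word 1, Count 1,Word 2, Count 2,
    content,2,fx_red,1,alpha,1,
    ,,ui_main,1,beta,1,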
#[derive(Copy, Clone, Debug)]
enum State {
Begin,
NonWord,
Word,
End,
}
#[tracing::instrument(skip_all)]
pub(crate) async fn run(_ctx: sdk::Context, matches: &ArgMatches) -> Result<()> {
let path = matches
.get_one::<PathBuf>("file")
.expect("missing required parameter");
let algorithm = matches
.get_one::<Algorithm>("algorithm")
.expect("parameter has default");
let min_length = matches
.get_one::<usize>("min-length")
.copied()
.expect("paramter has default");
let content = fs::read_to_string(&path)
.await
.wrap_err_with(|| format!("Failed to read file '{}'", path.display()))?;
let mut chars = content.chars();
if *algorithm == Algorithm::Paths {
algorithm_path_components(chars, min_length);
return Ok(());
}
let mut state = State::Begin;
let mut word = String::new();
let mut visited = HashMap::new();
'machine: loop {
state = match state {
State::Begin => match chars.next() {
None => State::End,
Some(c) if algorithm.is_start(c) => {
word.push(c);
State::Word
}
Some(_) => State::NonWord,
},
State::End => break 'machine,
State::NonWord => match chars.next() {
None => State::End,
Some(c) if algorithm.is_body(c) => {
word.push(c);
State::Word
}
Some(_) => State::NonWord,
},
State::Word => match chars.next() {
None => {
if word.len() >= min_length && algorithm.is_length(word.len()) {
visited
.entry(word.clone())
.and_modify(|v| *v += 1)
.or_insert(1);
}
State::End
}
Some(c) if algorithm.is_body(c) => {
word.push(c);
State::Word
}
Some(_) => {
if word.len() >= min_length && algorithm.is_length(word.len()) {
visited
.entry(word.clone())
.and_modify(|v| *v += 1)
.or_insert(1);
}
word.clear();
State::NonWord
}
},
}
}
let mut entries: Vec<(String, usize)> = visited.into_iter().collect();
// Reverse sides during comparison to get "highest to lowest"
entries.sort_by(|(_, a), (_, b)| b.partial_cmp(a).unwrap());
entries
.iter()
.for_each(|(word, count)| println!("{:<16} {}", word, count));
Ok(())
}
@ -0,0 +1,26 @@
use clap::{ArgMatches, Command};
use color_eyre::Result;
mod brute_force_words;
mod extract_words;
pub(crate) fn command_definition() -> Command {
Command::new("experiment")
.subcommand_required(true)
.about("A collection of utilities and experiments.")
.subcommand(brute_force_words::command_definition())
.subcommand(extract_words::command_definition())
}
#[tracing::instrument(skip_all)]
pub(crate) async fn run(ctx: sdk::Context, matches: &ArgMatches) -> Result<()> {
match matches.subcommand() {
// It's fine to block here, as this is the only thing that's executing on the runtime.
// The other option with `spawn_blocking` would require setting up values to be Send+Sync.
Some(("brute-force-words", sub_matches)) => brute_force_words::run(ctx, sub_matches),
Some(("extract-words", sub_matches)) => extract_words::run(ctx, sub_matches).await,
_ => unreachable!(
"clap is configured to require a subcommand, and they're all handled above"
),
}
}
@ -351,7 +351,6 @@ pub(crate) async fn run(_ctx: sdk::Context, matches: &ArgMatches) -> Result<()>
        },
        depends: vec![ModDependency::ID(String::from("DMF"))],
        bundled: true,
-        name_overrides: HashMap::new(),
    };
    tracing::debug!(?dtmt_cfg);
@ -1,30 +1,18 @@
+use std::collections::HashMap;
use std::path::PathBuf;
use clap::{Arg, ArgMatches, Command};
use color_eyre::eyre::{self, Context, Result};
use color_eyre::Help;
use futures::{StreamExt, TryStreamExt};
-use minijinja::Environment;
+use string_template::Template;
use tokio::fs::{self, DirBuilder};
const TEMPLATES: [(&str, &str); 5] = [
    (
        "dtmt.cfg",
-        r#"//
-// This is your mod's main configuration file. It tells DTMT how to build the mod,
-// and DTMM what to display to your users.
-// Certain files have been pre-filled by the template, the ones commented out (`//`)
-// are optional.
-//
-// A unique identifier (preferably lower case, alphanumeric)
-id = "{{id}}"
-// The display name that your users will see.
-// This doesn't have to be unique, but you still want to avoid being confused with other
-// mods.
+        r#"id = "{{id}}"
name = "{{name}}"
-// It's good practice to increase this number whenever you publish changes.
-// It's up to you if you use SemVer or something simpler like `1970-12-24`. It should sort and
-// compare well, though.
version = "0.1.0"
// author = ""
@ -44,25 +32,16 @@ categories = [
// A list of mod IDs that this mod depends on. You can find
// those IDs by downloading the mod and extracting their `dtmt.cfg`.
-// To make your fellow modders' lives easier, publish your own mods' IDs
-// somewhere visible, such as the Nexusmods page.
depends = [
    DMF
]
-// The primary resources that serve as the entry point to your
-// mod's code. Unless for very specific use cases, the generated
-// values shouldn't be changed.
resources = {
    init = "scripts/mods/{{id}}/init"
    data = "scripts/mods/{{id}}/data"
    localization = "scripts/mods/{{id}}/localization"
}
-// The list of packages, or bundles, to build.
-// Each one corresponds to a package definition in the named folder.
-// For mods that contain only code and/or a few small assets, a single
-// package will suffice.
packages = [
    "packages/mods/{{id}}"
]
@ -80,6 +59,7 @@ packages = [
        r#"local mod = get_mod("{{id}}")
-- Your mod code goes here.
+-- https://vmf-docs.verminti.de
"#,
    ),
    (
@ -157,45 +137,34 @@ pub(crate) async fn run(_ctx: sdk::Context, matches: &ArgMatches) -> Result<()>
    tracing::debug!(root = %root.display(), name, id);
-    let render_ctx = minijinja::context!(name => name.as_str(), id => id.as_str());
-    let env = Environment::new();
+    let mut data = HashMap::new();
+    data.insert("name", name.as_str());
+    data.insert("id", id.as_str());
    let templates = TEMPLATES
        .iter()
        .map(|(path_tmpl, content_tmpl)| {
-            env.render_str(path_tmpl, &render_ctx)
-                .wrap_err_with(|| format!("Failed to render template: {path_tmpl}"))
-                .and_then(|path| {
-                    env.render_named_str(&path, content_tmpl, &render_ctx)
-                        .wrap_err_with(|| format!("Failed to render template '{}'", &path))
-                        .map(|content| (root.join(path), content))
-                })
+            let path = Template::new(path_tmpl).render(&data);
+            let content = Template::new(content_tmpl).render(&data);
+            (root.join(path), content)
        })
-        .map(|res| async move {
-            match res {
-                Ok((path, content)) => {
-                    let dir = path
-                        .parent()
-                        .ok_or_else(|| eyre::eyre!("invalid root path"))?;
+        .map(|(path, content)| async move {
+            let dir = path
+                .parent()
+                .ok_or_else(|| eyre::eyre!("invalid root path"))?;
            DirBuilder::new()
                .recursive(true)
                .create(&dir)
                .await
-                .wrap_err_with(|| {
-                    format!("Failed to create directory {}", dir.display())
-                })?;
+                .wrap_err_with(|| format!("Failed to create directory {}", dir.display()))?;
            tracing::trace!("Writing file {}", path.display());
            fs::write(&path, content.as_bytes())
                .await
-                .wrap_err_with(|| {
-                    format!("Failed to write content to path {}", path.display())
-                })
-                }
-                Err(e) => Err(e),
-            }
+                .wrap_err_with(|| format!("Failed to write content to path {}", path.display()))
        });
    futures::stream::iter(templates)
@ -77,14 +77,17 @@ pub(crate) fn command_definition() -> Command {
    )
}
-#[tracing::instrument]
-async fn compile(
+async fn compile<P1, P2, P3>(
    cfg: &ModConfig,
-    out_path: impl AsRef<Path> + std::fmt::Debug,
-    archive_path: impl AsRef<Path> + std::fmt::Debug,
-    game_dir: Arc<Option<impl AsRef<Path> + std::fmt::Debug>>,
-) -> Result<()> {
-    let out_path = out_path.as_ref();
+    out_path: P1,
+    archive_path: P2,
+    game_dir: Arc<Option<P3>>,
+) -> Result<()>
+where
+    P1: AsRef<Path> + std::marker::Copy,
+    P2: AsRef<Path>,
+    P3: AsRef<Path>,
+{
    build(cfg, out_path, game_dir)
        .await
        .wrap_err("Failed to build bundles")?;
@ -1,5 +1,6 @@
#![feature(io_error_more)]
#![feature(let_chains)]
+#![feature(result_flattening)]
#![feature(test)]
#![windows_subsystem = "console"]
@ -11,7 +12,6 @@ use clap::value_parser;
use clap::{command, Arg};
use color_eyre::eyre;
use color_eyre::eyre::{Context, Result};
-use sdk::murmur::Dictionary;
use serde::{Deserialize, Serialize};
use tokio::fs::File;
use tokio::io::BufReader;
@ -21,6 +21,7 @@ mod cmd {
    pub mod build;
    pub mod bundle;
    pub mod dictionary;
+    pub mod experiment;
    pub mod migrate;
    pub mod murmur;
    pub mod new;
@ -36,21 +37,10 @@ struct GlobalConfig {
}
#[tokio::main]
-#[tracing::instrument(level = "error", fields(cmd_line = tracing::field::Empty))]
+#[tracing::instrument]
async fn main() -> Result<()> {
    color_eyre::install()?;
-    {
-        let span = tracing::Span::current();
-        if !span.is_disabled() {
-            let cmdline: String = std::env::args_os().fold(String::new(), |mut s, arg| {
-                s.push_str(&arg.to_string_lossy());
-                s
-            });
-            span.record("cmd_line", cmdline);
-        }
-    }
    let matches = command!()
        .subcommand_required(true)
        .arg(
@ -67,6 +57,7 @@ async fn main() -> Result<()> {
        .subcommand(cmd::build::command_definition())
        .subcommand(cmd::bundle::command_definition())
        .subcommand(cmd::dictionary::command_definition())
+        .subcommand(cmd::experiment::command_definition())
        .subcommand(cmd::migrate::command_definition())
        .subcommand(cmd::murmur::command_definition())
        .subcommand(cmd::new::command_definition())
@ -107,9 +98,8 @@ async fn main() -> Result<()> {
        let r = BufReader::new(f);
        let mut ctx = ctx.write().await;
-        match Dictionary::from_csv(r).await {
-            Ok(lookup) => ctx.lookup = Arc::new(lookup),
-            Err(err) => tracing::error!("{:#}", err),
+        if let Err(err) = ctx.lookup.from_csv(r).await {
+            tracing::error!("{:#}", err);
        }
    })
};
@ -145,6 +135,7 @@ async fn main() -> Result<()> {
        Some(("build", sub_matches)) => cmd::build::run(ctx, sub_matches).await?,
        Some(("bundle", sub_matches)) => cmd::bundle::run(ctx, sub_matches).await?,
        Some(("dictionary", sub_matches)) => cmd::dictionary::run(ctx, sub_matches).await?,
+        Some(("experiment", sub_matches)) => cmd::experiment::run(ctx, sub_matches).await?,
        Some(("migrate", sub_matches)) => cmd::migrate::run(ctx, sub_matches).await?,
        Some(("murmur", sub_matches)) => cmd::murmur::run(ctx, sub_matches).await?,
        Some(("new", sub_matches)) => cmd::new::run(ctx, sub_matches).await?,
@ -54,11 +54,17 @@ impl<'a> ShellParser<'a> {
                }
                _ => {}
            },
-            ParserState::SingleQuote => if c == b'\'' {
-                return Some(&self.bytes[start..(self.offset - 1)]);
+            ParserState::SingleQuote => match c {
+                b'\'' => {
+                    return Some(&self.bytes[start..(self.offset - 1)]);
+                }
+                _ => {}
            },
-            ParserState::DoubleQuote => if c == b'"' {
-                return Some(&self.bytes[start..(self.offset - 1)]);
+            ParserState::DoubleQuote => match c {
+                b'"' => {
+                    return Some(&self.bytes[start..(self.offset - 1)]);
+                }
+                _ => {}
            },
        }
    }
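For context, both quote states implement the usual shell-style splitting rules: a quote opens a region in which only the matching closing quote is special. A self-contained sketch of the idea (illustrative only, not this crate's `ShellParser` API):

    // Illustrative only: split on spaces, honoring single and double quotes.
    fn split_shell(input: &str) -> Vec<String> {
        let mut parts = Vec::new();
        let mut current = String::new();
        let mut quote: Option<char> = None;
        for c in input.chars() {
            match quote {
                Some(q) if c == q => quote = None,
                Some(_) => current.push(c),
                None => match c {
                    '\'' | '"' => quote = Some(c),
                    ' ' => {
                        if !current.is_empty() {
                            parts.push(std::mem::take(&mut current));
                        }
                    }
                    _ => current.push(c),
                },
            }
        }
        if !current.is_empty() {
            parts.push(current);
        }
        parts
    }

    fn main() {
        assert_eq!(
            split_shell(r#"run "hello world" 'a b'"#),
            vec!["run", "hello world", "a b"]
        );
    }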
@ -1 +1 @@
-Subproject commit bdefeef09803df45bdf6dae7f3ae289e58427e3a
+Subproject commit b40962a61c748756d7da293d9fff26aca019603e
@ -6,11 +6,11 @@ edition = "2021"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
-ansi_term = { workspace = true }
-color-eyre = { workspace = true }
-serde = { workspace = true }
-steamlocate = { workspace = true }
-time = { workspace = true }
-tracing = { workspace = true }
-tracing-error = { workspace = true }
-tracing-subscriber = { workspace = true }
+ansi_term = "0.12.1"
+color-eyre = "0.6.2"
+serde = "1.0.152"
+steamlocate = "2.0.0-beta.2"
+time = { version = "0.3.19", features = ["formatting", "local-offset", "macros"] }
+tracing = "0.1.37"
+tracing-error = "0.2.0"
+tracing-subscriber = "0.3.16"
@ -1,4 +1,3 @@
-use std::collections::HashMap;
use std::path::PathBuf;
use color_eyre::eyre::{OptionExt as _, WrapErr as _};
@ -68,8 +67,6 @@ pub struct ModConfig {
    pub depends: Vec<ModDependency>,
    #[serde(default = "default_true", skip_serializing_if = "is_true")]
    pub bundled: bool,
-    #[serde(default)]
-    pub name_overrides: HashMap<String, String>,
}
pub const STEAMAPP_ID: u32 = 1361210;
@ -87,7 +84,7 @@ pub fn collect_game_info() -> Result<Option<GameInfo>> {
        .find_app(STEAMAPP_ID)
        .wrap_err("Failed to look up game by Steam app ID")?;
-    let Some((app, library)) = found else {
+    let Some((app, _)) = found else {
        return Ok(None);
    };
@ -96,7 +93,7 @@ pub fn collect_game_info() -> Result<Option<GameInfo>> {
        .ok_or_eyre("Missing field 'last_updated'")?;
    Ok(Some(GameInfo {
-        path: library.path().join(app.install_dir),
+        path: app.install_dir.into(),
        last_updated: last_updated.into(),
    }))
}
@ -19,7 +19,7 @@ pub const TIME_FORMAT: &[FormatItem] = format_description!("[hour]:[minute]:[sec
pub fn format_fields(w: &mut Writer<'_>, field: &Field, val: &dyn std::fmt::Debug) -> Result {
    if field.name() == "message" {
-        write!(w, "{val:?}")
+        write!(w, "{:?}", val)
    } else {
        Ok(())
    }
@ -70,7 +70,7 @@ where
        writer,
        "[{}] [{:>5}] ",
        time,
-        color.bold().paint(format!("{level}"))
+        color.bold().paint(format!("{}", level))
    )?;
    ctx.field_format().format_fields(writer.by_ref(), event)?;
@ -84,7 +84,7 @@ pub fn create_tracing_subscriber() {
    EnvFilter::try_from_default_env().unwrap_or_else(|_| EnvFilter::try_new("info").unwrap());
    let (dev_stdout_layer, prod_stdout_layer, filter_layer) = if cfg!(debug_assertions) {
-        let fmt_layer = fmt::layer().pretty().with_writer(std::io::stderr);
+        let fmt_layer = fmt::layer().pretty();
        (Some(fmt_layer), None, None)
    } else {
        // Creates a layer that
@ -93,7 +93,6 @@ pub fn create_tracing_subscriber() {
        // - does not print spans/targets
        // - only prints time, not date
        let fmt_layer = fmt::layer()
-            .with_writer(std::io::stderr)
            .event_format(Formatter)
            .fmt_fields(debug_fn(format_fields));
lib/luajit2-sys Submodule
@ -0,0 +1 @@
Subproject commit 5d1a075742395f767c79d9c0d7466c6fb442f106
@ -1,20 +0,0 @@
[package]
name = "luajit2-sys"
version = "0.0.2"
description = "LuaJIT-2.1 FFI Bindings"
authors = ["Aaron Loucks <aloucks@cofront.net>"]
edition = "2021"
keywords = ["lua", "luajit", "script"]
license = "MIT OR Apache-2.0"
readme = "README.md"
repository = "https://github.com/aloucks/luajit2-sys"
documentation = "https://docs.rs/luajit2-sys"
links = "luajit"
[dependencies]
libc = { workspace = true }
[build-dependencies]
bindgen = { workspace = true }
cc = { workspace = true }
fs_extra = { workspace = true }
@ -1,201 +0,0 @@
Apache License
Version 2.0, January 2004
https://www.apache.org/licenses/LICENSE-2.0
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
https://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
@ -1,23 +0,0 @@
Permission is hereby granted, free of charge, to any
person obtaining a copy of this software and associated
documentation files (the "Software"), to deal in the
Software without restriction, including without
limitation the rights to use, copy, modify, merge,
publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software
is furnished to do so, subject to the following
conditions:
The above copyright notice and this permission notice
shall be included in all copies or substantial portions
of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF
ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED
TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT
SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR
IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
DEALINGS IN THE SOFTWARE.
@ -1,217 +0,0 @@
use cc::Build;
use fs_extra::dir;
use fs_extra::dir::CopyOptions;
use std::env;
use std::path::PathBuf;
use std::process::{Command, Stdio};
const LIB_NAME: &str = "luajit";
const LUAJIT_HEADERS: [&str; 4] = ["lua.h", "lualib.h", "lauxlib.h", "luajit.h"];
const LUAJIT_SRC: [&str; 65] = [
// LJCORE_O
// The MSVC toolchain cannot compile this assembler file,
// as it contains GNU-specific directives
// "lj_vm.S",
"lj_gc.c",
"lj_err.c",
"lj_char.c",
"lj_bc.c",
"lj_obj.c",
"lj_buf.c",
"lj_str.c",
"lj_tab.c",
"lj_func.c",
"lj_udata.c",
"lj_meta.c",
"lj_debug.c",
"lj_state.c",
"lj_dispatch.c",
"lj_vmevent.c",
"lj_vmmath.c",
"lj_strscan.c",
"lj_strfmt.c",
"lj_strfmt_num.c",
"lj_api.c",
"lj_profile.c",
"lj_lex.c",
"lj_parse.c",
"lj_bcread.c",
"lj_bcwrite.c",
"lj_load.c",
"lj_ir.c",
"lj_opt_mem.c",
"lj_opt_fold.c",
"lj_opt_narrow.c",
"lj_opt_dce.c",
"lj_opt_loop.c",
"lj_opt_split.c",
"lj_opt_sink.c",
"lj_mcode.c",
"lj_snap.c",
"lj_record.c",
"lj_crecord.c",
"lj_ffrecord.c",
"lj_asm.c",
"lj_trace.c",
"lj_gdbjit.c",
"lj_ctype.c",
"lj_cdata.c",
"lj_cconv.c",
"lj_ccall.c",
"lj_ccallback.c",
"lj_carith.c",
"lj_clib.c",
"lj_cparse.c",
"lj_lib.c",
"lj_alloc.c",
// LJLIB_O
"lib_aux.c",
"lib_base.c",
"lib_math.c",
"lib_bit.c",
"lib_string.c",
"lib_table.c",
"lib_io.c",
"lib_os.c",
"lib_package.c",
"lib_debug.c",
"lib_jit.c",
"lib_ffi.c",
"lib_init.c",
];
fn build_gcc(src_dir: &str) {
let mut buildcmd = Command::new("make");
if let Ok(flags) = env::var("CARGO_MAKEFLAGS") {
buildcmd.env("MAKEFLAGS", flags);
} else {
buildcmd.arg("-j8");
}
buildcmd.current_dir(src_dir);
buildcmd.stderr(Stdio::inherit());
buildcmd.arg("--no-silent");
// We do need to cross-compile even here, so that `lj_vm.o` is created
// for the correct architecture.
if env::var("CARGO_CFG_WINDOWS").is_ok() {
buildcmd.arg("TARGET_SYS=Windows");
buildcmd.arg("CROSS=x86_64-w64-mingw32-");
}
if cfg!(target_pointer_width = "32") {
buildcmd.arg("HOST_CC='gcc -m32'");
buildcmd.arg("-e");
} else {
buildcmd.arg("HOST_CC='gcc'");
}
let mut child = buildcmd.spawn().expect("failed to run make");
child
.wait()
.map(|status| status.success())
.expect("Failed to build LuaJIT");
}
fn build_msvc(src_dir: &str, out_dir: &str) {
let mut cc = Build::new();
// cc can't handle many of the `clang-dl`-specific flags, so
// we need to port them manually from a `make -n` run.
cc.out_dir(out_dir)
// `llvm-as` (which the clang-based toolchain for MSVC would use to compile `lj_vm.S`
// assembler) doesn't support some of the GNU-specific directives.
// However, the previous host-targeted compilation already created the
// object, so we simply link that.
.object(format!("{src_dir}/lj_vm.o"))
.define("_FILE_OFFSET_BITS", "64")
.define("_LARGEFILE_SOURCE", None)
.define("LUA_MULTILIB", "\"lib\"")
.define("LUAJIT_UNWIND_EXTERNAL", None)
.flag("-fcolor-diagnostics")
// Disable warnings
.flag("/W0")
.flag("/U _FORTIFY_SOURCE")
// Link statically
.flag("/MT")
// Omit frame pointers
.flag("/Oy");
for f in LUAJIT_SRC {
cc.file(format!("{src_dir}/{f}"));
}
cc.compile(LIB_NAME);
}
fn main() {
let luajit_dir = format!("{}/luajit", env!("CARGO_MANIFEST_DIR"));
let out_dir = env::var("OUT_DIR").unwrap();
let src_dir = format!("{out_dir}/luajit/src");
dbg!(&luajit_dir);
dbg!(&out_dir);
dbg!(&src_dir);
let mut copy_options = CopyOptions::new();
copy_options.overwrite = true;
dir::copy(&luajit_dir, &out_dir, &copy_options).expect("Failed to copy LuaJIT source");
// The first run builds with and for the host architecture.
// This also creates all the tools and generated sources that a compilation needs.
build_gcc(&src_dir);
// Then, for cross-compilation, we can utilize those generated
// sources to re-compile just the library.
if env::var("CARGO_CFG_WINDOWS").is_ok() {
build_msvc(&src_dir, &out_dir);
println!("cargo:rustc-link-search={out_dir}");
} else {
println!("cargo:rustc-link-search=native={src_dir}");
}
println!("cargo:lib-name={LIB_NAME}");
println!("cargo:include={src_dir}");
println!("cargo:rustc-link-lib=static={LIB_NAME}");
let mut bindings = bindgen::Builder::default();
for header in LUAJIT_HEADERS {
println!("cargo:rerun-if-changed={luajit_dir}/src/{header}");
bindings = bindings.header(format!("{luajit_dir}/src/{header}"));
}
let bindings = bindings
.allowlist_var("LUA.*")
.allowlist_var("LUAJIT.*")
.allowlist_type("lua_.*")
.allowlist_type("luaL_.*")
.allowlist_function("lua_.*")
.allowlist_function("luaL_.*")
.allowlist_function("luaJIT.*")
.ctypes_prefix("libc")
.impl_debug(true)
.use_core()
.detect_include_paths(true)
.formatter(bindgen::Formatter::Rustfmt)
.sort_semantically(true)
.merge_extern_blocks(true)
.parse_callbacks(Box::new(bindgen::CargoCallbacks::new()));
let bindings = if env::var("CARGO_CFG_WINDOWS").is_ok() {
bindings
.clang_arg("-I/xwin/sdk/include/ucrt")
.clang_arg("-I/xwin/sdk/include/um")
.clang_arg("-I/xwin/sdk/include/shared")
.clang_arg("-I/xwin/crt/include")
.generate()
.expect("Failed to generate bindings")
} else {
bindings.generate().expect("Failed to generate bindings")
};
let out_path = PathBuf::from(env::var("OUT_DIR").unwrap());
bindings
.write_to_file(out_path.join("bindings.rs"))
.expect("Failed to write bindings");
}
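The script above prints `cargo:lib-name` and `cargo:include` so that dependent crates can locate the freshly built LuaJIT. As a minimal sketch of the consuming side, assuming the crate also declares `links = "luajit"` in its Cargo.toml (that key is what makes Cargo forward the metadata), a dependent's own build script could read it back through the `DEP_LUAJIT_*` environment variables:

```rust
// Hypothetical build.rs of a crate that depends on luajit2-sys.
// Cargo re-exports the dependency's `cargo:include=...` line as
// DEP_LUAJIT_INCLUDE, named after the dependency's `links` value.
fn main() {
    if let Ok(include_dir) = std::env::var("DEP_LUAJIT_INCLUDE") {
        // Build a local C shim against the exported LuaJIT headers
        // (assumes a `cc` build-dependency).
        cc::Build::new()
            .file("src/shim.c")
            .include(include_dir)
            .compile("shim");
        println!("cargo:rerun-if-changed=src/shim.c");
    }
}
```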

@ -1 +0,0 @@
Subproject commit 70f4b15ee45a6137fe6b48b941faea79d72f7159

@ -1,167 +0,0 @@
#![no_std]
#![allow(non_snake_case)]
#![allow(non_camel_case_types)]
#![allow(clippy::deprecated_semver)]
#![allow(clippy::missing_safety_doc)]
//! # LuaJIT 2.1
//!
//! <http://luajit.org>
//!
//! <http://www.lua.org/manual/5.1/manual.html>
//!
//! ## Performance considerations
//!
//! The _Not Yet Implemented_ guide documents which language features will be JIT compiled
//! into native machine code.
//!
//! <http://wiki.luajit.org/NYI>
mod ffi {
include!(concat!(env!("OUT_DIR"), "/bindings.rs"));
}
pub use ffi::*;
use core::ptr;
// These are defined as macros
/// <https://www.lua.org/manual/5.1/manual.html#lua_pop>
#[inline]
pub unsafe fn lua_pop(L: *mut lua_State, idx: libc::c_int) {
lua_settop(L, -(idx) - 1)
}
/// <https://www.lua.org/manual/5.1/manual.html#lua_newtable>
#[inline]
pub unsafe fn lua_newtable(L: *mut lua_State) {
lua_createtable(L, 0, 0)
}
/// <https://www.lua.org/manual/5.1/manual.html#lua_register>
#[inline]
pub unsafe fn lua_register(L: *mut lua_State, name: *const libc::c_char, f: lua_CFunction) {
lua_pushcfunction(L, f);
lua_setglobal(L, name);
}
/// <https://www.lua.org/manual/5.1/manual.html#lua_pushcfunction>
#[inline]
pub unsafe fn lua_pushcfunction(L: *mut lua_State, f: lua_CFunction) {
lua_pushcclosure(L, f, 0);
}
/// <https://www.lua.org/manual/5.1/manual.html#lua_strlen>
#[inline]
pub unsafe fn lua_strlen(L: *mut lua_State, idx: libc::c_int) -> usize {
lua_objlen(L, idx)
}
/// <https://www.lua.org/manual/5.1/manual.html#lua_isfunction>
#[inline]
pub unsafe fn lua_isfunction(L: *mut lua_State, idx: libc::c_int) -> libc::c_int {
(lua_type(L, idx) == LUA_TFUNCTION as i32) as i32
}
/// <https://www.lua.org/manual/5.1/manual.html#lua_istable>
#[inline]
pub unsafe fn lua_istable(L: *mut lua_State, idx: libc::c_int) -> libc::c_int {
(lua_type(L, idx) == LUA_TTABLE as i32) as i32
}
/// <https://www.lua.org/manual/5.1/manual.html#lua_islightuserdata>
#[inline]
pub unsafe fn lua_islightuserdata(L: *mut lua_State, idx: libc::c_int) -> libc::c_int {
(lua_type(L, idx) == LUA_TLIGHTUSERDATA as i32) as i32
}
/// <https://www.lua.org/manual/5.1/manual.html#lua_isnil>
#[inline]
pub unsafe fn lua_isnil(L: *mut lua_State, idx: libc::c_int) -> libc::c_int {
(lua_type(L, idx) == LUA_TNIL as i32) as i32
}
/// <https://www.lua.org/manual/5.1/manual.html#lua_isboolean>
#[inline]
pub unsafe fn lua_isboolean(L: *mut lua_State, idx: libc::c_int) -> libc::c_int {
(lua_type(L, idx) == LUA_TBOOLEAN as i32) as i32
}
/// <https://www.lua.org/manual/5.1/manual.html#lua_isthread>
#[inline]
pub unsafe fn lua_isthread(L: *mut lua_State, idx: libc::c_int) -> libc::c_int {
(lua_type(L, idx) == LUA_TTHREAD as i32) as i32
}
/// <https://www.lua.org/manual/5.1/manual.html#lua_isnone>
#[inline]
pub unsafe fn lua_isnone(L: *mut lua_State, idx: libc::c_int) -> libc::c_int {
(lua_type(L, idx) == LUA_TNONE) as i32
}
/// <https://www.lua.org/manual/5.1/manual.html#lua_isnoneornil>
#[inline]
pub unsafe fn lua_isnoneornil(L: *mut lua_State, idx: libc::c_int) -> libc::c_int {
(lua_type(L, idx) <= 0) as i32
}
/// <https://www.lua.org/manual/5.1/manual.html#lua_pushliteral>
#[inline]
pub unsafe fn lua_pushliteral(L: *mut lua_State, s: &str) {
lua_pushlstring(L, s.as_ptr() as _, s.len() as _);
}
/// <https://www.lua.org/manual/5.1/manual.html#lua_setglobal>
#[inline]
pub unsafe fn lua_setglobal(L: *mut lua_State, k: *const libc::c_char) {
lua_setfield(L, LUA_GLOBALSINDEX, k);
}
/// <https://www.lua.org/manual/5.1/manual.html#lua_getglobal>
#[inline]
pub unsafe fn lua_getglobal(L: *mut lua_State, k: *const libc::c_char) {
lua_getfield(L, LUA_GLOBALSINDEX, k)
}
/// <https://www.lua.org/manual/5.1/manual.html#lua_tostring>
#[inline]
pub unsafe fn lua_tostring(L: *mut lua_State, idx: libc::c_int) -> *const libc::c_char {
lua_tolstring(L, idx, ptr::null_mut())
}
// Additional compatibility items that are defined as macros
/// `luaL_newstate()`
#[inline]
#[deprecated(since = "Lua 5.1", note = "replace with `luaL_newstate()`")]
pub unsafe fn lua_open() -> *mut lua_State {
luaL_newstate()
}
/// `lua_pushvalue(L, LUA_REGISTRYINDEX)`
#[inline]
#[deprecated(
since = "Lua 5.1",
note = "replace with `lua_pushvalue(L, LUA_REGISTRYINDEX)`"
)]
pub unsafe fn lua_getregistry(L: *mut lua_State) {
lua_pushvalue(L, LUA_REGISTRYINDEX)
}
/// `lua_gc(L, LUA_GCCOUNT as _, 0)`
#[inline]
#[deprecated(
since = "Lua 5.1",
note = "replace with `lua_gc(L, LUA_GCCOUNT as _, 0)`"
)]
pub unsafe fn lua_getgccount(L: *mut lua_State) -> libc::c_int {
lua_gc(L, LUA_GCCOUNT as _, 0)
}
/// `lua_Reader`
#[deprecated(since = "Lua 5.1", note = "replace with `lua_Reader`")]
pub type lua_Chunkreader = lua_Reader;
/// `lua_Writer`
#[deprecated(since = "Lua 5.1", note = "replace with `lua_Writer`")]
pub type lua_Chunkwriter = lua_Writer;
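Taken together, these shims let the usual Lua C API patterns be written from Rust. A minimal usage sketch (not part of the crate itself; error handling elided):

```rust
use std::ffi::CString;
use luajit2_sys as lua;

fn main() {
    unsafe {
        let state = lua::luaL_newstate();
        lua::luaL_openlibs(state);

        // Run a chunk that defines a global.
        let code = CString::new("answer = 21 * 2").unwrap();
        if lua::luaL_loadstring(state, code.as_ptr()) as u32 == lua::LUA_OK {
            lua::lua_pcall(state, 0, 0, 0);
        }

        // Read the global back through the macro shims defined above.
        let name = CString::new("answer").unwrap();
        lua::lua_getglobal(state, name.as_ptr());
        let answer = lua::lua_tonumber(state, -1);
        lua::lua_pop(state, 1);
        assert_eq!(answer as i64, 42);

        lua::lua_close(state);
    }
}
```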

@ -12,7 +12,7 @@ regex = "1.7.1"
reqwest = { version = "0.12.4" }
serde = { version = "1.0.152", features = ["derive"] }
serde_json = "1.0.94"
-thiserror = "2.0.0"
+thiserror = "1.0.39"
time = { version = "0.3.20", features = ["serde"] }
tracing = "0.1.37"
url = { version = "2.3.1", features = ["serde"] }

@ -28,7 +28,7 @@ pub enum Error {
HTTP(#[from] reqwest::Error),
#[error("invalid URL: {0:?}")]
URLParseError(#[from] url::ParseError),
-#[error("failed to deserialize due to {error}: {json}")]
+#[error("failed to deserialize '{error}': {json}")]
Deserialize {
json: String,
error: serde_json::Error,
@ -37,7 +37,7 @@ pub enum Error {
InvalidHeaderValue(#[from] InvalidHeaderValue),
#[error("this error cannot happen")]
Infallible(#[from] Infallible),
-#[error("invalid NXM URL '{url}': {0}", url = .1.as_str())]
+#[error("invalid NXM URL '{}': {0}", .1.as_str())]
InvalidNXM(&'static str, Url),
#[error("{0}")]
Custom(String),
@ -99,7 +99,7 @@ impl Api {
#[tracing::instrument(skip(self))]
pub async fn mods_id(&self, id: u64) -> Result<Mod> {
-let url = BASE_URL_GAME.join(&format!("mods/{id}.json"))?;
+let url = BASE_URL_GAME.join(&format!("mods/{}.json", id))?;
let req = self.client.get(url);
self.send(req).await
}

@ -6,8 +6,8 @@ edition = "2021"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
-color-eyre = { workspace = true }
-tracing = { workspace = true }
+color-eyre = "0.6.2"
+tracing = "0.1.37"
[build-dependencies]
-bindgen = "0.72.0"
+bindgen = "0.69.4"

@ -11,7 +11,7 @@ fn main() {
} else {
"oo2core_win64"
};
-println!("cargo:rustc-link-lib=static={lib_name}");
+println!("cargo:rustc-link-lib=static={}", lib_name);
} else {
println!("cargo:rustc-link-lib=static=oo2corelinux64");
println!("cargo:rustc-link-lib=stdc++");

@ -7,7 +7,6 @@ use std::ptr;
use color_eyre::{eyre, Result};
#[allow(dead_code)]
-#[allow(clippy::identity_op)]
mod bindings {
include!(concat!(env!("OUT_DIR"), "/bindings.rs"));
}
@ -52,7 +51,6 @@ impl From<OodleLZ_CheckCRC> for bindings::OodleLZ_CheckCRC {
#[tracing::instrument(skip(data))]
pub fn decompress<I>(
data: I,
-out_size: usize,
fuzz_safe: OodleLZ_FuzzSafe,
check_crc: OodleLZ_CheckCRC,
) -> Result<Vec<u8>>
@ -60,7 +58,7 @@ where
I: AsRef<[u8]>,
{
let data = data.as_ref();
-let mut out = vec![0; out_size];
+let mut out = vec![0; CHUNK_SIZE];
let verbosity = if tracing::enabled!(tracing::Level::INFO) {
bindings::OodleLZ_Verbosity_OodleLZ_Verbosity_Minimal
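The two signatures differ in who picks the output size: the removed parameter let callers size the buffer, while the replacement always allocates `CHUNK_SIZE` up front. A hedged sketch of calling the `CHUNK_SIZE` variant:

```rust
use oodle::{OodleLZ_CheckCRC, OodleLZ_FuzzSafe};

// Illustrative only: decompress one compressed bundle chunk.
// The returned buffer is CHUNK_SIZE bytes; callers truncate as needed.
fn decompress_chunk(compressed: &[u8]) -> color_eyre::Result<Vec<u8>> {
    oodle::decompress(compressed, OodleLZ_FuzzSafe::No, OodleLZ_CheckCRC::No)
}
```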

@ -4,23 +4,23 @@ version = "0.3.0"
edition = "2021"
[dependencies]
-async-recursion = { workspace = true }
-bitflags = { workspace = true }
-byteorder = { workspace = true }
-color-eyre = { workspace = true }
-csv-async = { workspace = true }
-fastrand = { workspace = true }
-futures = { workspace = true }
-futures-util = { workspace = true }
-glob = { workspace = true }
-luajit2-sys = { workspace = true }
-nanorand = { workspace = true }
-oodle = { workspace = true }
-path-slash = { workspace = true }
-pin-project-lite = { workspace = true }
-serde = { workspace = true }
-serde_sjson = { workspace = true }
-tokio = { workspace = true }
-tokio-stream = { workspace = true }
-tracing = { workspace = true }
-tracing-error = { workspace = true }
+bitflags = "2.5.0"
+byteorder = "1.4.3"
+color-eyre = "0.6.2"
+csv-async = { version = "1.2.4", features = ["tokio", "serde"] }
+fastrand = "2.1.0"
+futures = "0.3.25"
+futures-util = "0.3.24"
+glob = "0.3.0"
+nanorand = "0.7.0"
+pin-project-lite = "0.2.9"
+serde = { version = "1.0.147", features = ["derive"] }
+serde_sjson = { path = "../../lib/serde_sjson", version = "*" }
+oodle = { path = "../../lib/oodle", version = "*" }
+tokio = { version = "1.21.2", features = ["rt-multi-thread", "fs", "process", "macros", "tracing", "io-util", "io-std"] }
+tokio-stream = { version = "0.1.11", features = ["fs", "io-util"] }
+tracing = { version = "0.1.37", features = ["async-await"] }
+tracing-error = "0.2.0"
+luajit2-sys = { path = "../../lib/luajit2-sys", version = "*" }
+async-recursion = "1.0.2"
+path-slash = "0.2.1"

@ -43,11 +43,10 @@ impl<T: FromBinary> FromBinary for Vec<T> {
}
pub mod sync {
-use std::ffi::CStr;
-use std::io::{self, Read, Seek, SeekFrom, Write};
+use std::io::{self, Read, Seek, SeekFrom};
use byteorder::{LittleEndian, ReadBytesExt, WriteBytesExt};
-use color_eyre::eyre::{self, WrapErr};
+use color_eyre::eyre::WrapErr;
use color_eyre::{Help, Report, Result, SectionExt};
macro_rules! make_read {
@ -123,7 +122,7 @@ pub mod sync {
};
}
-pub trait ReadExt: Read + Seek {
+pub trait ReadExt: ReadBytesExt + Seek {
fn read_u8(&mut self) -> io::Result<u8> {
ReadBytesExt::read_u8(self)
}
@ -131,6 +130,7 @@ pub mod sync {
make_read!(read_u32, read_u32_le, u32);
make_read!(read_u64, read_u64_le, u64);
+make_skip!(skip_u8, read_u8, u8);
make_skip!(skip_u32, read_u32, u32);
// Implementation based on https://en.wikipedia.com/wiki/LEB128
@ -165,13 +165,25 @@
}
fn read_string_len(&mut self, len: usize) -> Result<String> {
-let pos = self.stream_position();
-let res = read_string_len(self, len);
+let mut buf = vec![0; len];
+let res = self
+.read_exact(&mut buf)
+.map_err(Report::new)
+.and_then(|_| {
+String::from_utf8(buf).map_err(|err| {
+let ascii = String::from_utf8_lossy(err.as_bytes()).to_string();
+let bytes = format!("{:?}", err.as_bytes());
+Report::new(err)
+.with_section(move || bytes.header("Bytes:"))
+.with_section(move || ascii.header("ASCII:"))
+})
+});
if res.is_ok() {
return res;
}
+let pos = self.stream_position();
if pos.is_ok() {
res.with_section(|| {
format!("{pos:#X} ({pos})", pos = pos.unwrap()).header("Position: ")
@ -180,17 +192,9 @@ pub mod sync {
res
}
}
-fn read_bool(&mut self) -> Result<bool> {
-match ReadExt::read_u8(self)? {
-0 => Ok(false),
-1 => Ok(true),
-v => eyre::bail!("Invalid value for boolean '{}'", v),
-}
-}
}
-pub trait WriteExt: Write + Seek {
+pub trait WriteExt: WriteBytesExt + Seek {
fn write_u8(&mut self, val: u8) -> io::Result<()> {
WriteBytesExt::write_u8(self, val)
}
@ -198,10 +202,6 @@ pub mod sync {
make_write!(write_u32, write_u32_le, u32);
make_write!(write_u64, write_u64_le, u64);
-fn write_bool(&mut self, val: bool) -> io::Result<()> {
-WriteBytesExt::write_u8(self, if val { 1 } else { 0 })
-}
fn write_padding(&mut self) -> io::Result<usize> {
let pos = self.stream_position()?;
let size = 16 - (pos % 16) as usize;
@ -218,8 +218,8 @@ pub mod sync {
}
}
-impl<R: Read + Seek + ?Sized> ReadExt for R {}
+impl<R: ReadBytesExt + Seek + ?Sized> ReadExt for R {}
-impl<W: Write + Seek + ?Sized> WriteExt for W {}
+impl<W: WriteBytesExt + Seek + ?Sized> WriteExt for W {}
pub(crate) fn _read_up_to<R>(r: &mut R, buf: &mut Vec<u8>) -> Result<usize>
where
@ -243,22 +243,4 @@ pub mod sync {
Err(err).with_section(|| format!("{pos:#X} ({pos})").header("Position: "))
}
-fn read_string_len(mut r: impl Read, len: usize) -> Result<String> {
-let mut buf = vec![0; len];
-r.read_exact(&mut buf)
-.wrap_err_with(|| format!("Failed to read {len} bytes"))?;
-let res = match CStr::from_bytes_until_nul(&buf) {
-Ok(s) => {
-let s = s.to_str()?;
-Ok(s.to_string())
-}
-Err(_) => String::from_utf8(buf.clone()).map_err(Report::new),
-};
-res.wrap_err("Invalid binary for UTF8 string")
-.with_section(|| format!("{}", String::from_utf8_lossy(&buf)).header("ASCI:"))
-.with_section(|| format!("{buf:x?}").header("Bytes:"))
-}
}
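The `LEB128` comment in the hunk above refers to the variable-length integer encoding used by the bundle format. For reference, a self-contained sketch of the unsigned variant (this is the textbook algorithm, not necessarily the crate's exact code):

```rust
use std::io::{self, Read};

// Seven payload bits per byte, least-significant group first;
// a set high bit means another byte follows.
fn read_uleb128(r: &mut impl Read) -> io::Result<u64> {
    let mut result = 0u64;
    let mut shift = 0u32;
    loop {
        let mut byte = [0u8; 1];
        r.read_exact(&mut byte)?;
        result |= u64::from(byte[0] & 0x7f) << shift;
        if byte[0] & 0x80 == 0 {
            return Ok(result);
        }
        shift += 7;
    }
}

fn main() -> io::Result<()> {
    // 0xE5 0x8E 0x26 decodes to 624485, the worked example from the Wikipedia article.
    let mut data: &[u8] = &[0xE5, 0x8E, 0x26];
    assert_eq!(read_uleb128(&mut data)?, 624_485);
    Ok(())
}
```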

@ -13,21 +13,21 @@ use crate::binary::ToBinary;
use crate::murmur::Murmur64;
use crate::Bundle;
-use super::filetype::BundleFileType;
+use super::file::BundleFileType;
const DATABASE_VERSION: u32 = 0x6;
const FILE_VERSION: u32 = 0x4;
pub struct BundleFile {
-pub name: String,
-pub stream: String,
-pub platform_specific: bool,
-pub file_time: u64,
+name: String,
+stream: String,
+platform_specific: bool,
+file_time: u64,
}
pub struct FileName {
-pub extension: BundleFileType,
-pub name: Murmur64,
+extension: BundleFileType,
+name: Murmur64,
}
pub struct BundleDatabase {
@ -36,34 +36,7 @@ pub struct BundleDatabase {
bundle_contents: HashMap<Murmur64, Vec<FileName>>,
}
-// Implements the partial Murmur that's used by the engine to compute bundle resource hashes,
-// but in a way that the loop can be done outside the function.
-#[inline(always)]
-fn add_to_resource_hash(mut k: u64, name: impl Into<u64>) -> u64 {
-const M: u64 = 0xc6a4a7935bd1e995;
-const R: u64 = 47;
-let mut h: u64 = name.into();
-k = k.wrapping_mul(M);
-k ^= k >> R;
-k = k.wrapping_mul(M);
-h ^= k;
-k = M.wrapping_mul(h);
-k
-}
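Because the helper keeps the loop outside, computing a bundle's resource hash is a plain fold over its file-name hashes, which is exactly how `add_bundle` below uses it. A sketch with illustrative inputs:

```rust
// Illustrative only: the u64 values stand in for real file-name hashes.
fn resource_hash_for(file_name_hashes: &[u64]) -> u64 {
    // Mirrors add_bundle: seed with 0, then mix in each name hash in turn.
    file_name_hashes
        .iter()
        .fold(0, |acc, &name| add_to_resource_hash(acc, name))
}
```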
impl BundleDatabase {
-pub fn bundles(&self) -> &HashMap<Murmur64, Vec<BundleFile>> {
-&self.stored_files
-}
-pub fn files(&self) -> &HashMap<Murmur64, Vec<FileName>> {
-&self.bundle_contents
-}
pub fn add_bundle(&mut self, bundle: &Bundle) {
let hash = bundle.name().to_murmur64();
let name = hash.to_string();
@ -96,26 +69,20 @@ impl BundleDatabase {
}
}
-let mut resource_hash = 0;
for f in bundle.files() {
-let name = f.base_name().to_murmur64();
let file_name = FileName {
extension: f.file_type(),
-name,
+name: f.base_name().to_murmur64(),
};
-resource_hash = add_to_resource_hash(resource_hash, name);
-// TODO: Make sure each file name only exists once. Probably best to turn
-// the `Vec` into a sorted `HashSet`.
+// TODO: Compute actual resource hash
+self.resource_hashes.insert(hash, 0);
self.bundle_contents
.entry(hash)
.or_default()
.push(file_name);
}
-self.resource_hashes.insert(hash, resource_hash);
}
}
@ -136,7 +103,7 @@ impl FromBinary for BundleDatabase {
let mut stored_files = HashMap::with_capacity(num_entries);
for _ in 0..num_entries {
-let hash = r.read_u64().map(Murmur64::from)?;
+let hash = Murmur64::from(r.read_u64()?);
let num_files = r.read_u32()? as usize;
let mut files = Vec::with_capacity(num_files);
@ -194,7 +161,7 @@ impl FromBinary for BundleDatabase {
let mut resource_hashes = HashMap::with_capacity(num_hashes);
for _ in 0..num_hashes {
-let name = r.read_u64().map(Murmur64::from)?;
+let name = Murmur64::from(r.read_u64()?);
let hash = r.read_u64()?;
resource_hashes.insert(name, hash);
@ -204,14 +171,14 @@ impl FromBinary for BundleDatabase {
let mut bundle_contents = HashMap::with_capacity(num_contents);
for _ in 0..num_contents {
-let hash = r.read_u64().map(Murmur64::from)?;
+let hash = Murmur64::from(r.read_u64()?);
let num_files = r.read_u32()? as usize;
let mut files = Vec::with_capacity(num_files);
for _ in 0..num_files {
-let extension = r.read_u64().map(BundleFileType::from)?;
-let name = r.read_u64().map(Murmur64::from)?;
+let extension = BundleFileType::from(r.read_u64()?);
+let name = Murmur64::from(r.read_u64()?);
files.push(FileName { extension, name });
}

@ -5,28 +5,421 @@ use bitflags::bitflags;
use color_eyre::eyre::Context;
use color_eyre::{eyre, Result};
use futures::future::join_all;
+use serde::Serialize;
use crate::binary::sync::*;
use crate::filetype::*;
use crate::murmur::{HashGroup, IdString64, Murmur64};
-use super::filetype::BundleFileType;
+#[derive(Debug, Hash, PartialEq, Eq, Copy, Clone)]
pub enum BundleFileType {
Animation,
AnimationCurves,
Apb,
BakedLighting,
Bik,
BlendSet,
Bones,
Chroma,
CommonPackage,
Config,
Crypto,
Data,
Entity,
Flow,
Font,
Ies,
Ini,
Input,
Ivf,
Keys,
Level,
Lua,
Material,
Mod,
MouseCursor,
NavData,
NetworkConfig,
OddleNet,
Package,
Particles,
PhysicsProperties,
RenderConfig,
RtPipeline,
Scene,
Shader,
ShaderLibrary,
ShaderLibraryGroup,
ShadingEnvionmentMapping,
ShadingEnvironment,
Slug,
SlugAlbum,
SoundEnvironment,
SpuJob,
StateMachine,
StaticPVS,
Strings,
SurfaceProperties,
Texture,
TimpaniBank,
TimpaniMaster,
Tome,
Ugg,
Unit,
Upb,
VectorField,
Wav,
WwiseBank,
WwiseDep,
WwiseEvent,
WwiseMetadata,
WwiseStream,
Xml,
Unknown(Murmur64),
}
impl BundleFileType {
pub fn ext_name(&self) -> String {
match self {
BundleFileType::AnimationCurves => String::from("animation_curves"),
BundleFileType::Animation => String::from("animation"),
BundleFileType::Apb => String::from("apb"),
BundleFileType::BakedLighting => String::from("baked_lighting"),
BundleFileType::Bik => String::from("bik"),
BundleFileType::BlendSet => String::from("blend_set"),
BundleFileType::Bones => String::from("bones"),
BundleFileType::Chroma => String::from("chroma"),
BundleFileType::CommonPackage => String::from("common_package"),
BundleFileType::Config => String::from("config"),
BundleFileType::Crypto => String::from("crypto"),
BundleFileType::Data => String::from("data"),
BundleFileType::Entity => String::from("entity"),
BundleFileType::Flow => String::from("flow"),
BundleFileType::Font => String::from("font"),
BundleFileType::Ies => String::from("ies"),
BundleFileType::Ini => String::from("ini"),
BundleFileType::Input => String::from("input"),
BundleFileType::Ivf => String::from("ivf"),
BundleFileType::Keys => String::from("keys"),
BundleFileType::Level => String::from("level"),
BundleFileType::Lua => String::from("lua"),
BundleFileType::Material => String::from("material"),
BundleFileType::Mod => String::from("mod"),
BundleFileType::MouseCursor => String::from("mouse_cursor"),
BundleFileType::NavData => String::from("nav_data"),
BundleFileType::NetworkConfig => String::from("network_config"),
BundleFileType::OddleNet => String::from("oodle_net"),
BundleFileType::Package => String::from("package"),
BundleFileType::Particles => String::from("particles"),
BundleFileType::PhysicsProperties => String::from("physics_properties"),
BundleFileType::RenderConfig => String::from("render_config"),
BundleFileType::RtPipeline => String::from("rt_pipeline"),
BundleFileType::Scene => String::from("scene"),
BundleFileType::ShaderLibraryGroup => String::from("shader_library_group"),
BundleFileType::ShaderLibrary => String::from("shader_library"),
BundleFileType::Shader => String::from("shader"),
BundleFileType::ShadingEnvionmentMapping => String::from("shading_environment_mapping"),
BundleFileType::ShadingEnvironment => String::from("shading_environment"),
BundleFileType::SlugAlbum => String::from("slug_album"),
BundleFileType::Slug => String::from("slug"),
BundleFileType::SoundEnvironment => String::from("sound_environment"),
BundleFileType::SpuJob => String::from("spu_job"),
BundleFileType::StateMachine => String::from("state_machine"),
BundleFileType::StaticPVS => String::from("static_pvs"),
BundleFileType::Strings => String::from("strings"),
BundleFileType::SurfaceProperties => String::from("surface_properties"),
BundleFileType::Texture => String::from("texture"),
BundleFileType::TimpaniBank => String::from("timpani_bank"),
BundleFileType::TimpaniMaster => String::from("timpani_master"),
BundleFileType::Tome => String::from("tome"),
BundleFileType::Ugg => String::from("ugg"),
BundleFileType::Unit => String::from("unit"),
BundleFileType::Upb => String::from("upb"),
BundleFileType::VectorField => String::from("vector_field"),
BundleFileType::Wav => String::from("wav"),
BundleFileType::WwiseBank => String::from("wwise_bank"),
BundleFileType::WwiseDep => String::from("wwise_dep"),
BundleFileType::WwiseEvent => String::from("wwise_event"),
BundleFileType::WwiseMetadata => String::from("wwise_metadata"),
BundleFileType::WwiseStream => String::from("wwise_stream"),
BundleFileType::Xml => String::from("xml"),
BundleFileType::Unknown(s) => format!("{s:016X}"),
}
}
pub fn decompiled_ext_name(&self) -> String {
match self {
BundleFileType::Texture => String::from("dds"),
BundleFileType::WwiseBank => String::from("bnk"),
BundleFileType::WwiseStream => String::from("ogg"),
_ => self.ext_name(),
}
}
pub fn hash(&self) -> Murmur64 {
Murmur64::from(*self)
}
}
impl std::str::FromStr for BundleFileType {
type Err = color_eyre::Report;
fn from_str(s: &str) -> Result<Self, Self::Err> {
let val = match s {
"animation_curves" => BundleFileType::AnimationCurves,
"animation" => BundleFileType::Animation,
"apb" => BundleFileType::Apb,
"baked_lighting" => BundleFileType::BakedLighting,
"bik" => BundleFileType::Bik,
"blend_set" => BundleFileType::BlendSet,
"bones" => BundleFileType::Bones,
"chroma" => BundleFileType::Chroma,
"common_package" => BundleFileType::CommonPackage,
"config" => BundleFileType::Config,
"crypto" => BundleFileType::Crypto,
"data" => BundleFileType::Data,
"entity" => BundleFileType::Entity,
"flow" => BundleFileType::Flow,
"font" => BundleFileType::Font,
"ies" => BundleFileType::Ies,
"ini" => BundleFileType::Ini,
"input" => BundleFileType::Input,
"ivf" => BundleFileType::Ivf,
"keys" => BundleFileType::Keys,
"level" => BundleFileType::Level,
"lua" => BundleFileType::Lua,
"material" => BundleFileType::Material,
"mod" => BundleFileType::Mod,
"mouse_cursor" => BundleFileType::MouseCursor,
"nav_data" => BundleFileType::NavData,
"network_config" => BundleFileType::NetworkConfig,
"oodle_net" => BundleFileType::OddleNet,
"package" => BundleFileType::Package,
"particles" => BundleFileType::Particles,
"physics_properties" => BundleFileType::PhysicsProperties,
"render_config" => BundleFileType::RenderConfig,
"rt_pipeline" => BundleFileType::RtPipeline,
"scene" => BundleFileType::Scene,
"shader_library_group" => BundleFileType::ShaderLibraryGroup,
"shader_library" => BundleFileType::ShaderLibrary,
"shader" => BundleFileType::Shader,
"shading_environment_mapping" => BundleFileType::ShadingEnvionmentMapping,
"shading_environment" => BundleFileType::ShadingEnvironment,
"slug_album" => BundleFileType::SlugAlbum,
"slug" => BundleFileType::Slug,
"sound_environment" => BundleFileType::SoundEnvironment,
"spu_job" => BundleFileType::SpuJob,
"state_machine" => BundleFileType::StateMachine,
"static_pvs" => BundleFileType::StaticPVS,
"strings" => BundleFileType::Strings,
"surface_properties" => BundleFileType::SurfaceProperties,
"texture" => BundleFileType::Texture,
"timpani_bank" => BundleFileType::TimpaniBank,
"timpani_master" => BundleFileType::TimpaniMaster,
"tome" => BundleFileType::Tome,
"ugg" => BundleFileType::Ugg,
"unit" => BundleFileType::Unit,
"upb" => BundleFileType::Upb,
"vector_field" => BundleFileType::VectorField,
"wav" => BundleFileType::Wav,
"wwise_bank" => BundleFileType::WwiseBank,
"wwise_dep" => BundleFileType::WwiseDep,
"wwise_event" => BundleFileType::WwiseEvent,
"wwise_metadata" => BundleFileType::WwiseMetadata,
"wwise_stream" => BundleFileType::WwiseStream,
"xml" => BundleFileType::Xml,
s => eyre::bail!("Unknown type string '{}'", s),
};
Ok(val)
}
}
impl Serialize for BundleFileType {
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
where
S: serde::Serializer,
{
let value = self.ext_name();
value.serialize(serializer)
}
}
impl From<Murmur64> for BundleFileType {
fn from(value: Murmur64) -> Self {
Self::from(Into::<u64>::into(value))
}
}
impl From<u64> for BundleFileType {
fn from(hash: u64) -> BundleFileType {
match hash {
0x931e336d7646cc26 => BundleFileType::Animation,
0xdcfb9e18fff13984 => BundleFileType::AnimationCurves,
0x3eed05ba83af5090 => BundleFileType::Apb,
0x7ffdb779b04e4ed1 => BundleFileType::BakedLighting,
0xaa5965f03029fa18 => BundleFileType::Bik,
0xe301e8af94e3b5a3 => BundleFileType::BlendSet,
0x18dead01056b72e9 => BundleFileType::Bones,
0xb7893adf7567506a => BundleFileType::Chroma,
0xfe9754bd19814a47 => BundleFileType::CommonPackage,
0x82645835e6b73232 => BundleFileType::Config,
0x69108ded1e3e634b => BundleFileType::Crypto,
0x8fd0d44d20650b68 => BundleFileType::Data,
0x9831ca893b0d087d => BundleFileType::Entity,
0x92d3ee038eeb610d => BundleFileType::Flow,
0x9efe0a916aae7880 => BundleFileType::Font,
0x8f7d5a2c0f967655 => BundleFileType::Ies,
0xd526a27da14f1dc5 => BundleFileType::Ini,
0x2bbcabe5074ade9e => BundleFileType::Input,
0xfa4a8e091a91201e => BundleFileType::Ivf,
0xa62f9297dc969e85 => BundleFileType::Keys,
0x2a690fd348fe9ac5 => BundleFileType::Level,
0xa14e8dfa2cd117e2 => BundleFileType::Lua,
0xeac0b497876adedf => BundleFileType::Material,
0x3fcdd69156a46417 => BundleFileType::Mod,
0xb277b11fe4a61d37 => BundleFileType::MouseCursor,
0x169de9566953d264 => BundleFileType::NavData,
0x3b1fa9e8f6bac374 => BundleFileType::NetworkConfig,
0xb0f2c12eb107f4d8 => BundleFileType::OddleNet,
0xad9c6d9ed1e5e77a => BundleFileType::Package,
0xa8193123526fad64 => BundleFileType::Particles,
0xbf21403a3ab0bbb1 => BundleFileType::PhysicsProperties,
0x27862fe24795319c => BundleFileType::RenderConfig,
0x9ca183c2d0e76dee => BundleFileType::RtPipeline,
0x9d0a795bfe818d19 => BundleFileType::Scene,
0xcce8d5b5f5ae333f => BundleFileType::Shader,
0xe5ee32a477239a93 => BundleFileType::ShaderLibrary,
0x9e5c3cc74575aeb5 => BundleFileType::ShaderLibraryGroup,
0x250e0a11ac8e26f8 => BundleFileType::ShadingEnvionmentMapping,
0xfe73c7dcff8a7ca5 => BundleFileType::ShadingEnvironment,
0xa27b4d04a9ba6f9e => BundleFileType::Slug,
0xe9fc9ea7042e5ec0 => BundleFileType::SlugAlbum,
0xd8b27864a97ffdd7 => BundleFileType::SoundEnvironment,
0xf97af9983c05b950 => BundleFileType::SpuJob,
0xa486d4045106165c => BundleFileType::StateMachine,
0xe3f0baa17d620321 => BundleFileType::StaticPVS,
0x0d972bab10b40fd3 => BundleFileType::Strings,
0xad2d3fa30d9ab394 => BundleFileType::SurfaceProperties,
0xcd4238c6a0c69e32 => BundleFileType::Texture,
0x99736be1fff739a4 => BundleFileType::TimpaniBank,
0x00a3e6c59a2b9c6c => BundleFileType::TimpaniMaster,
0x19c792357c99f49b => BundleFileType::Tome,
0x712d6e3dd1024c9c => BundleFileType::Ugg,
0xe0a48d0be9a7453f => BundleFileType::Unit,
0xa99510c6e86dd3c2 => BundleFileType::Upb,
0xf7505933166d6755 => BundleFileType::VectorField,
0x786f65c00a816b19 => BundleFileType::Wav,
0x535a7bd3e650d799 => BundleFileType::WwiseBank,
0xaf32095c82f2b070 => BundleFileType::WwiseDep,
0xaabdd317b58dfc8a => BundleFileType::WwiseEvent,
0xd50a8b7e1c82b110 => BundleFileType::WwiseMetadata,
0x504b55235d21440e => BundleFileType::WwiseStream,
0x76015845a6003765 => BundleFileType::Xml,
_ => BundleFileType::Unknown(Murmur64::from(hash)),
}
}
}
impl From<BundleFileType> for u64 {
fn from(t: BundleFileType) -> u64 {
match t {
BundleFileType::Animation => 0x931e336d7646cc26,
BundleFileType::AnimationCurves => 0xdcfb9e18fff13984,
BundleFileType::Apb => 0x3eed05ba83af5090,
BundleFileType::BakedLighting => 0x7ffdb779b04e4ed1,
BundleFileType::Bik => 0xaa5965f03029fa18,
BundleFileType::BlendSet => 0xe301e8af94e3b5a3,
BundleFileType::Bones => 0x18dead01056b72e9,
BundleFileType::Chroma => 0xb7893adf7567506a,
BundleFileType::CommonPackage => 0xfe9754bd19814a47,
BundleFileType::Config => 0x82645835e6b73232,
BundleFileType::Crypto => 0x69108ded1e3e634b,
BundleFileType::Data => 0x8fd0d44d20650b68,
BundleFileType::Entity => 0x9831ca893b0d087d,
BundleFileType::Flow => 0x92d3ee038eeb610d,
BundleFileType::Font => 0x9efe0a916aae7880,
BundleFileType::Ies => 0x8f7d5a2c0f967655,
BundleFileType::Ini => 0xd526a27da14f1dc5,
BundleFileType::Input => 0x2bbcabe5074ade9e,
BundleFileType::Ivf => 0xfa4a8e091a91201e,
BundleFileType::Keys => 0xa62f9297dc969e85,
BundleFileType::Level => 0x2a690fd348fe9ac5,
BundleFileType::Lua => 0xa14e8dfa2cd117e2,
BundleFileType::Material => 0xeac0b497876adedf,
BundleFileType::Mod => 0x3fcdd69156a46417,
BundleFileType::MouseCursor => 0xb277b11fe4a61d37,
BundleFileType::NavData => 0x169de9566953d264,
BundleFileType::NetworkConfig => 0x3b1fa9e8f6bac374,
BundleFileType::OddleNet => 0xb0f2c12eb107f4d8,
BundleFileType::Package => 0xad9c6d9ed1e5e77a,
BundleFileType::Particles => 0xa8193123526fad64,
BundleFileType::PhysicsProperties => 0xbf21403a3ab0bbb1,
BundleFileType::RenderConfig => 0x27862fe24795319c,
BundleFileType::RtPipeline => 0x9ca183c2d0e76dee,
BundleFileType::Scene => 0x9d0a795bfe818d19,
BundleFileType::Shader => 0xcce8d5b5f5ae333f,
BundleFileType::ShaderLibrary => 0xe5ee32a477239a93,
BundleFileType::ShaderLibraryGroup => 0x9e5c3cc74575aeb5,
BundleFileType::ShadingEnvionmentMapping => 0x250e0a11ac8e26f8,
BundleFileType::ShadingEnvironment => 0xfe73c7dcff8a7ca5,
BundleFileType::Slug => 0xa27b4d04a9ba6f9e,
BundleFileType::SlugAlbum => 0xe9fc9ea7042e5ec0,
BundleFileType::SoundEnvironment => 0xd8b27864a97ffdd7,
BundleFileType::SpuJob => 0xf97af9983c05b950,
BundleFileType::StateMachine => 0xa486d4045106165c,
BundleFileType::StaticPVS => 0xe3f0baa17d620321,
BundleFileType::Strings => 0x0d972bab10b40fd3,
BundleFileType::SurfaceProperties => 0xad2d3fa30d9ab394,
BundleFileType::Texture => 0xcd4238c6a0c69e32,
BundleFileType::TimpaniBank => 0x99736be1fff739a4,
BundleFileType::TimpaniMaster => 0x00a3e6c59a2b9c6c,
BundleFileType::Tome => 0x19c792357c99f49b,
BundleFileType::Ugg => 0x712d6e3dd1024c9c,
BundleFileType::Unit => 0xe0a48d0be9a7453f,
BundleFileType::Upb => 0xa99510c6e86dd3c2,
BundleFileType::VectorField => 0xf7505933166d6755,
BundleFileType::Wav => 0x786f65c00a816b19,
BundleFileType::WwiseBank => 0x535a7bd3e650d799,
BundleFileType::WwiseDep => 0xaf32095c82f2b070,
BundleFileType::WwiseEvent => 0xaabdd317b58dfc8a,
BundleFileType::WwiseMetadata => 0xd50a8b7e1c82b110,
BundleFileType::WwiseStream => 0x504b55235d21440e,
BundleFileType::Xml => 0x76015845a6003765,
BundleFileType::Unknown(hash) => hash.into(),
}
}
}
impl From<BundleFileType> for Murmur64 {
fn from(t: BundleFileType) -> Murmur64 {
let hash: u64 = t.into();
Murmur64::from(hash)
}
}
impl std::fmt::Display for BundleFileType {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "{}", self.ext_name())
}
}
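A small consistency check of the mappings above, written as the kind of unit test one might add (the hash constants are copied from the `From<u64>` arms):

```rust
#[test]
fn bundle_file_type_round_trips() {
    let lua = BundleFileType::from(0xa14e8dfa2cd117e2_u64);
    assert_eq!(lua, BundleFileType::Lua);
    assert_eq!(lua.ext_name(), "lua");
    assert_eq!(u64::from(lua), 0xa14e8dfa2cd117e2);

    // Hashes without a known extension survive the round trip unchanged.
    let unknown = BundleFileType::from(0x0123456789abcdef_u64);
    assert!(matches!(unknown, BundleFileType::Unknown(_)));
    assert_eq!(u64::from(unknown), 0x0123456789abcdef);
}
```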
#[derive(Debug)]
struct BundleFileHeader {
variant: u32,
-external: bool,
-size: usize,
unknown_1: u8,
+size: usize,
len_data_file_name: usize,
}
-#[derive(Clone)]
pub struct BundleFileVariant {
property: u32,
data: Vec<u8>,
data_file_name: Option<String>,
-external: bool,
+// Seems to be related to whether there is a data path.
unknown_1: u8,
}
@ -40,7 +433,6 @@ impl BundleFileVariant {
property: 0,
data: Vec::new(),
data_file_name: None,
-external: false,
unknown_1: 0,
}
}
@ -65,30 +457,21 @@ impl BundleFileVariant {
self.data_file_name.as_ref()
}
-pub fn external(&self) -> bool {
-self.external
-}
-pub fn unknown_1(&self) -> u8 {
-self.unknown_1
-}
#[tracing::instrument(skip_all)]
fn read_header<R>(r: &mut R) -> Result<BundleFileHeader>
where
R: Read + Seek,
{
let variant = r.read_u32()?;
-let external = r.read_bool()?;
-let size = r.read_u32()? as usize;
let unknown_1 = r.read_u8()?;
+let size = r.read_u32()? as usize;
+r.skip_u8(1)?;
let len_data_file_name = r.read_u32()? as usize;
Ok(BundleFileHeader {
size,
-external,
-variant,
unknown_1,
+variant,
len_data_file_name,
})
}
@ -99,7 +482,7 @@ impl BundleFileVariant {
W: Write + Seek,
{
w.write_u32(self.property)?;
-w.write_bool(self.external)?;
+w.write_u8(self.unknown_1)?;
let len_data_file_name = self.data_file_name.as_ref().map(|s| s.len()).unwrap_or(0);
@ -117,36 +500,13 @@ impl BundleFileVariant {
}
}
-impl std::fmt::Debug for BundleFileVariant {
-fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
-let mut out = f.debug_struct("BundleFileVariant");
-out.field("property", &self.property);
-if self.data.len() <= 5 {
-out.field("data", &format!("{:x?}", &self.data));
-} else {
-out.field(
-"data",
-&format!("{:x?}.. ({} bytes)", &self.data[..5], &self.data.len()),
-);
-}
-out.field("data_file_name", &self.data_file_name)
-.field("external", &self.external)
-.finish()
-}
-}
bitflags! {
#[derive(Default, Clone, Copy, Debug)]
pub struct Properties: u32 {
const DATA = 0b100;
-// A custom flag used by DTMT to signify a file altered by mods.
-const MODDED = 1 << 31;
}
}
-#[derive(Clone, Debug)]
pub struct BundleFile {
file_type: BundleFileType,
name: IdString64,
@ -155,7 +515,7 @@ pub struct BundleFile {
}
impl BundleFile {
-pub fn new(name: impl Into<IdString64>, file_type: BundleFileType) -> Self {
+pub fn new(name: String, file_type: BundleFileType) -> Self {
Self {
file_type,
name: name.into(),
@ -168,18 +528,6 @@ impl BundleFile {
self.variants.push(variant)
}
-pub fn set_variants(&mut self, variants: Vec<BundleFileVariant>) {
-self.variants = variants;
-}
-pub fn set_props(&mut self, props: Properties) {
-self.props = props;
-}
-pub fn set_modded(&mut self, is_modded: bool) {
-self.props.set(Properties::MODDED, is_modded);
-}
#[tracing::instrument(name = "File::read", skip(ctx, r))]
pub fn from_reader<R>(ctx: &crate::Context, r: &mut R, props: Properties) -> Result<Self>
where
@ -235,7 +583,6 @@ impl BundleFile {
let s = r
.read_string_len(header.len_data_file_name)
.wrap_err("Failed to read data file name")?;
Some(s)
} else {
None
@ -248,7 +595,6 @@ impl BundleFile {
property: header.variant,
data,
data_file_name,
-external: header.external,
unknown_1: header.unknown_1,
};
@ -276,7 +622,7 @@ impl BundleFile {
for variant in self.variants.iter() {
w.write_u32(variant.property())?;
-w.write_bool(variant.external)?;
+w.write_u8(variant.unknown_1)?;
let len_data_file_name = variant.data_file_name().map(|s| s.len()).unwrap_or(0);
@ -301,15 +647,20 @@ impl BundleFile {
Ok(w.into_inner())
}
-#[tracing::instrument("File::from_sjson", skip(sjson, name), fields(name = %name.display()))]
-pub async fn from_sjson(
-name: IdString64,
+#[tracing::instrument(name = "File::from_sjson", skip(sjson))]
+pub async fn from_sjson<P, S>(
+name: String,
file_type: BundleFileType,
-sjson: impl AsRef<str>,
-root: impl AsRef<Path> + std::fmt::Debug,
-) -> Result<Self> {
+sjson: S,
+root: P,
+) -> Result<Self>
+where
+P: AsRef<Path> + std::fmt::Debug,
+S: AsRef<str>,
+{
match file_type {
-BundleFileType::Lua => lua::compile(name, sjson).wrap_err("Failed to compile Lua file"),
+BundleFileType::Lua => lua::compile(name.clone(), sjson)
+.wrap_err_with(|| format!("Failed to compile Lua file '{}'", name)),
BundleFileType::Unknown(_) => {
eyre::bail!("Unknown file type. Cannot compile from SJSON");
}
@ -348,13 +699,17 @@ impl BundleFile {
s
}
-pub fn matches_name(&self, name: &IdString64) -> bool {
-if self.name == *name {
+pub fn matches_name<S>(&self, name: S) -> bool
+where
+S: Into<IdString64>,
+{
+let name = name.into();
+if self.name == name {
return true;
}
if let IdString64::String(name) = name {
-self.name(false, None) == *name || self.name(true, None) == *name
+self.name(false, None) == name || self.name(true, None) == name
} else {
false
}
@ -392,16 +747,18 @@ impl BundleFile {
Ok(files)
}
-#[tracing::instrument(
-name = "File::decompiled",
-skip_all,
-fields(file = self.name(false, None), file_type = self.file_type().ext_name(), variants = self.variants.len())
-)]
+#[tracing::instrument(name = "File::decompiled", skip_all)]
pub async fn decompiled(&self, ctx: &crate::Context) -> Result<Vec<UserFile>> {
let file_type = self.file_type();
-// The `Strings` type handles all variants combined.
-// For the other ones, each variant will be its own file.
+if tracing::enabled!(tracing::Level::DEBUG) {
+tracing::debug!(
+name = self.name(true, None),
+variants = self.variants.len(),
+"Attempting to decompile"
+);
+}
if file_type == BundleFileType::Strings {
return strings::decompile(ctx, &self.variants);
}

@ -1,175 +0,0 @@
use color_eyre::eyre;
use color_eyre::Result;
use serde::Serialize;
use crate::murmur::Murmur64;
macro_rules! make_enum {
(
$( $variant:ident, $hash:expr, $ext:expr $(, $decompiled:expr)? ; )+
) => {
#[derive(Debug, Hash, PartialEq, Eq, Copy, Clone)]
pub enum BundleFileType {
$(
$variant,
)+
Unknown(Murmur64),
}
impl BundleFileType {
pub fn ext_name(&self) -> String {
match self {
$(
Self::$variant => String::from($ext),
)+
Self::Unknown(s) => format!("{s:016X}"),
}
}
pub fn decompiled_ext_name(&self) -> String {
match self {
$(
$( Self::$variant => String::from($decompiled), )?
)+
_ => self.ext_name(),
}
}
}
impl std::str::FromStr for BundleFileType {
type Err = color_eyre::Report;
fn from_str(s: &str) -> Result<Self> {
match s {
$(
$ext => Ok(Self::$variant),
)+
s => eyre::bail!("Unknown type string '{}'", s),
}
}
}
impl From<u64> for BundleFileType {
fn from(h: u64) -> Self {
match h {
$(
$hash => Self::$variant,
)+
hash => Self::Unknown(hash.into()),
}
}
}
impl From<BundleFileType> for u64 {
fn from(t: BundleFileType) -> u64 {
match t {
$(
BundleFileType::$variant => $hash,
)+
BundleFileType::Unknown(hash) => hash.into(),
}
}
}
}
}
make_enum! {
AnimationCurves, 0xdcfb9e18fff13984, "animation_curves";
Animation, 0x931e336d7646cc26, "animation";
Apb, 0x3eed05ba83af5090, "apb";
BakedLighting, 0x7ffdb779b04e4ed1, "baked_lighting";
Bik, 0xaa5965f03029fa18, "bik";
BlendSet, 0xe301e8af94e3b5a3, "blend_set";
Bones, 0x18dead01056b72e9, "bones";
Chroma, 0xb7893adf7567506a, "chroma";
CommonPackage, 0xfe9754bd19814a47, "common_package";
Config, 0x82645835e6b73232, "config";
Crypto, 0x69108ded1e3e634b, "crypto";
Data, 0x8fd0d44d20650b68, "data";
Entity, 0x9831ca893b0d087d, "entity";
Flow, 0x92d3ee038eeb610d, "flow";
Font, 0x9efe0a916aae7880, "font";
Ies, 0x8f7d5a2c0f967655, "ies";
Ini, 0xd526a27da14f1dc5, "ini";
Input, 0x2bbcabe5074ade9e, "input";
Ivf, 0xfa4a8e091a91201e, "ivf";
Keys, 0xa62f9297dc969e85, "keys";
Level, 0x2a690fd348fe9ac5, "level";
Lua, 0xa14e8dfa2cd117e2, "lua";
Material, 0xeac0b497876adedf, "material";
Mod, 0x3fcdd69156a46417, "mod";
MouseCursor, 0xb277b11fe4a61d37, "mouse_cursor";
NavData, 0x169de9566953d264, "nav_data";
NetworkConfig, 0x3b1fa9e8f6bac374, "network_config";
OddleNet, 0xb0f2c12eb107f4d8, "oodle_net";
Package, 0xad9c6d9ed1e5e77a, "package";
Particles, 0xa8193123526fad64, "particles";
PhysicsProperties, 0xbf21403a3ab0bbb1, "physics_properties";
RenderConfig, 0x27862fe24795319c, "render_config";
RtPipeline, 0x9ca183c2d0e76dee, "rt_pipeline";
Scene, 0x9d0a795bfe818d19, "scene";
Shader, 0xcce8d5b5f5ae333f, "shader";
ShaderLibrary, 0xe5ee32a477239a93, "shader_library";
ShaderLibraryGroup, 0x9e5c3cc74575aeb5, "shader_library_group";
ShadingEnvionmentMapping, 0x250e0a11ac8e26f8, "shading_envionment_mapping";
ShadingEnvironment, 0xfe73c7dcff8a7ca5, "shading_environment";
Slug, 0xa27b4d04a9ba6f9e, "slug";
SlugAlbum, 0xe9fc9ea7042e5ec0, "slug_album";
SoundEnvironment, 0xd8b27864a97ffdd7, "sound_environment";
SpuJob, 0xf97af9983c05b950, "spu_job";
StateMachine, 0xa486d4045106165c, "state_machine";
StaticPVS, 0xe3f0baa17d620321, "static_pvs";
Strings, 0x0d972bab10b40fd3, "strings";
SurfaceProperties, 0xad2d3fa30d9ab394, "surface_properties";
Texture, 0xcd4238c6a0c69e32, "texture", "dds";
TimpaniBank, 0x99736be1fff739a4, "timpani_bank";
TimpaniMaster, 0x00a3e6c59a2b9c6c, "timpani_master";
Tome, 0x19c792357c99f49b, "tome";
Ugg, 0x712d6e3dd1024c9c, "ugg";
Unit, 0xe0a48d0be9a7453f, "unit";
Upb, 0xa99510c6e86dd3c2, "upb";
VectorField, 0xf7505933166d6755, "vector_field";
Wav, 0x786f65c00a816b19, "wav";
WwiseBank, 0x535a7bd3e650d799, "wwise_bank", "bnk";
WwiseDep, 0xaf32095c82f2b070, "wwise_dep";
WwiseEvent, 0xaabdd317b58dfc8a, "wwise_event";
WwiseMetadata, 0xd50a8b7e1c82b110, "wwise_metadata";
WwiseStream, 0x504b55235d21440e, "wwise_stream", "ogg";
Xml, 0x76015845a6003765, "xml";
Theme, 0x38BB9442048A7FBD, "theme";
MissionThemes, 0x80F2DE893657F83A, "mission_themes";
}
impl BundleFileType {
pub fn hash(&self) -> Murmur64 {
Murmur64::from(*self)
}
}
impl Serialize for BundleFileType {
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
where
S: serde::Serializer,
{
let value = self.ext_name();
value.serialize(serializer)
}
}
impl From<Murmur64> for BundleFileType {
fn from(value: Murmur64) -> Self {
Self::from(Into::<u64>::into(value))
}
}
impl From<BundleFileType> for Murmur64 {
fn from(t: BundleFileType) -> Murmur64 {
let hash: u64 = t.into();
Murmur64::from(hash)
}
}
impl std::fmt::Display for BundleFileType {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "{}", self.ext_name())
}
}
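The macro keeps name, hash, and the optional decompiled extension in a single table, so the generated `FromStr`, `From<u64>`, and `ext_name` impls cannot drift apart. A usage sketch against the rows above:

```rust
use std::str::FromStr;

#[test]
fn table_rows_stay_in_sync() {
    // One row drives parsing, hashing, and the decompiled extension.
    let t = BundleFileType::from_str("wwise_stream").unwrap();
    assert_eq!(u64::from(t), 0x504b55235d21440e);
    assert_eq!(t.decompiled_ext_name(), "ogg");

    // Rows without a third column fall back to the plain extension name.
    assert_eq!(BundleFileType::Lua.decompiled_ext_name(), "lua");
}
```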

@ -7,14 +7,13 @@ use color_eyre::{Help, Report, SectionExt};
use oodle::{OodleLZ_CheckCRC, OodleLZ_FuzzSafe, CHUNK_SIZE};
use crate::binary::sync::*;
-use crate::bundle::file::Properties;
use crate::murmur::{HashGroup, IdString64, Murmur64};
pub(crate) mod database;
pub(crate) mod file;
-pub(crate) mod filetype;
-pub use file::{BundleFile, BundleFileVariant, Properties};
-pub use filetype::BundleFileType;
+pub use file::{BundleFile, BundleFileType, BundleFileVariant};
#[derive(Clone, Copy, Debug, PartialEq, PartialOrd)]
enum BundleFormat {
@ -162,7 +161,6 @@ impl Bundle {
// TODO: Optimize to not reallocate?
let mut raw_buffer = oodle::decompress(
&compressed_buffer,
-oodle::CHUNK_SIZE,
OodleLZ_FuzzSafe::No,
OodleLZ_CheckCRC::No,
)
@ -237,7 +235,7 @@ impl Bundle {
// Ceiling division (or division toward infinity) to calculate
// the number of chunks required to fit the unpacked data.
-let num_chunks = unpacked_data.len().div_ceil(CHUNK_SIZE);
+let num_chunks = (unpacked_data.len() + CHUNK_SIZE - 1) / CHUNK_SIZE;
tracing::trace!(num_chunks);
w.write_u32(num_chunks as u32)?;
@ -360,7 +358,6 @@ where
// TODO: Optimize to not reallocate?
let mut raw_buffer = oodle::decompress(
&compressed_buffer,
-oodle::CHUNK_SIZE,
OodleLZ_FuzzSafe::No,
OodleLZ_CheckCRC::No,
)?;
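Both spellings of the ceiling division in that hunk compute the same chunk count; `usize::div_ceil` has simply been stable since Rust 1.73. With a hypothetical chunk size of 4:

```rust
fn main() {
    const CHUNK_SIZE: usize = 4; // illustrative; the real value comes from the oodle crate
    let len = 10_usize; // 10 bytes of unpacked data need 3 chunks
    assert_eq!(len.div_ceil(CHUNK_SIZE), 3);
    assert_eq!((len + CHUNK_SIZE - 1) / CHUNK_SIZE, 3);
}
```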

@ -1,11 +1,8 @@
-use std::ffi::OsString;
-use std::path::PathBuf;
use std::process::Command;
-use std::sync::Arc;
+use std::{ffi::OsString, path::PathBuf};
use crate::murmur::{Dictionary, HashGroup, IdString64, Murmur32, Murmur64};
-#[derive(Clone)]
pub struct CmdLine {
cmd: OsString,
args: Vec<OsString>,
@ -55,7 +52,7 @@ impl From<&CmdLine> for Command {
}
pub struct Context {
-pub lookup: Arc<Dictionary>,
+pub lookup: Dictionary,
pub ljd: Option<CmdLine>,
pub revorb: Option<String>,
pub ww2ogg: Option<String>,
@ -65,7 +62,7 @@ pub struct Context {
impl Context {
pub fn new() -> Self {
Self {
-lookup: Arc::new(Dictionary::new()),
+lookup: Dictionary::new(),
ljd: None,
revorb: None,
ww2ogg: None,

@ -15,7 +15,6 @@ use tokio::fs;
use crate::binary::sync::ReadExt;
use crate::binary::sync::WriteExt;
use crate::bundle::file::{BundleFileVariant, UserFile};
-use crate::murmur::IdString64;
use crate::{BundleFile, BundleFileType};
const BITSQUID_LUAJIT_HEADER: u32 = 0x8253461B;
@ -118,22 +117,26 @@ where
}
#[tracing::instrument(skip_all)]
-pub fn compile(name: impl Into<IdString64>, code: impl AsRef<str>) -> Result<BundleFile> {
+pub fn compile<S, C>(name: S, code: C) -> Result<BundleFile>
+where
+S: Into<String>,
+C: AsRef<str>,
+{
let name = name.into();
let code = code.as_ref();
tracing::trace!(
"Compiling '{}', {} bytes of code",
-name.display(),
-code.len()
+name,
+code.as_bytes().len()
);
let bytecode = unsafe {
let state = lua::luaL_newstate();
lua::luaL_openlibs(state);
-let name = CString::new(format!("@{}", name.display()).into_bytes())
-.wrap_err_with(|| format!("Cannot convert name into CString: {}", name.display()))?;
+let name = CString::new(format!("@{name}").into_bytes())
+.wrap_err_with(|| format!("Cannot convert name into CString: {}", name))?;
match lua::luaL_loadbuffer(
state,
code.as_ptr() as _,
@ -156,10 +159,10 @@ pub fn compile(name: impl Into<IdString64>, code: impl AsRef<str>) -> Result<Bun
}
_ => unreachable!(),
}
-lua::lua_setglobal(state, c"fn".as_ptr());
+lua::lua_setglobal(state, b"fn\0".as_ptr() as _);
-let run = c"return string.dump(fn, false)";
-match lua::luaL_loadstring(state, run.as_ptr()) as u32 {
+let run = b"return string.dump(fn, false)\0";
+match lua::luaL_loadstring(state, run.as_ptr() as _) as u32 {
lua::LUA_OK => {}
lua::LUA_ERRSYNTAX => {
let err = lua::lua_tostring(state, -1);
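Both variants of `compile` use the same trick: the compiled chunk is stored in the global `fn`, and `return string.dump(fn, false)` is loaded so that calling it leaves the bytecode on the Lua stack as a string. A hedged sketch of how that string would then be copied out (illustrative, not necessarily the crate's exact code):

```rust
use luajit2_sys as lua;

// Hypothetical continuation inside the `unsafe` block, after the dump
// chunk has been called with lua_pcall(state, 0, 1, 0):
unsafe fn take_bytecode(state: *mut lua::lua_State) -> Vec<u8> {
    let mut len: usize = 0;
    let ptr = lua::lua_tolstring(state, -1, &mut len);
    // Copy the Lua-owned bytes out before popping the value.
    let bytecode = std::slice::from_raw_parts(ptr as *const u8, len).to_vec();
    lua::lua_pop(state, 1);
    bytecode
}
```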

@@ -7,22 +7,13 @@ use std::str::FromStr;
 use async_recursion::async_recursion;
 use color_eyre::eyre::{self, Context};
 use color_eyre::Result;
+use path_slash::PathBufExt;
 use tokio::fs;

 use crate::binary::sync::{ReadExt, WriteExt};
-use crate::bundle::file::UserFile;
-use crate::bundle::filetype::BundleFileType;
-use crate::murmur::{HashGroup, IdString64, Murmur64};
+use crate::bundle::file::{BundleFileType, UserFile};
+use crate::murmur::{HashGroup, Murmur64};

-/// Resolves a relative path that might contain wildcards into a list of
-/// paths that exist on disk and match that wildcard.
-/// This is similar to globbing in Unix shells, but with far fewer features.
-///
-/// The only wildcard character allowed is `*`, and only at the end of the string,
-/// where it matches all files recursively in that directory.
-///
-/// `t` is an optional extension name, that may be used to force a wildcard
-/// path to only match that file type `t`.
 #[tracing::instrument]
 #[async_recursion]
 async fn resolve_wildcard<P1, P2>(
@@ -99,12 +90,12 @@ where
     Ok(paths)
 }

-type PackageType = HashMap<BundleFileType, HashSet<String>>;
+type PackageType = HashMap<BundleFileType, HashSet<PathBuf>>;
 type PackageDefinition = HashMap<String, HashSet<String>>;

 #[derive(Default)]
 pub struct Package {
-    _name: IdString64,
+    _name: String,
     _root: PathBuf,
     inner: PackageType,
     flags: u8,
@@ -125,9 +116,9 @@ impl DerefMut for Package {
 }

 impl Package {
-    pub fn new(name: impl Into<IdString64>, root: PathBuf) -> Self {
+    pub fn new(name: String, root: PathBuf) -> Self {
         Self {
-            _name: name.into(),
+            _name: name,
             _root: root,
             inner: Default::default(),
             flags: 1,
@@ -138,22 +129,17 @@ impl Package {
         self.values().fold(0, |total, files| total + files.len())
     }

-    pub fn add_file(&mut self, file_type: BundleFileType, name: impl Into<String>) {
+    pub fn add_file<P: Into<PathBuf>>(&mut self, file_type: BundleFileType, name: P) {
         self.inner.entry(file_type).or_default().insert(name.into());
     }

     #[tracing::instrument("Package::from_sjson", skip(sjson), fields(sjson_len = sjson.as_ref().len()))]
-    pub async fn from_sjson<P, S>(
-        sjson: S,
-        name: impl Into<IdString64> + std::fmt::Debug,
-        root: P,
-    ) -> Result<Self>
+    pub async fn from_sjson<P, S>(sjson: S, name: String, root: P) -> Result<Self>
     where
         P: AsRef<Path> + std::fmt::Debug,
         S: AsRef<str>,
     {
         let root = root.as_ref();
-        let name = name.into();
         let definition: PackageDefinition = serde_sjson::from_str(sjson.as_ref())?;
         let mut inner: PackageType = Default::default();

@@ -187,11 +173,7 @@ impl Package {
                 continue;
             };

-            tracing::debug!("Adding file {}", path.display());
-            inner
-                .entry(t)
-                .or_default()
-                .insert(path.display().to_string());
+            inner.entry(t).or_default().insert(path);
             }
         }
     }
@@ -210,9 +192,11 @@ impl Package {
     pub fn to_sjson(&self) -> Result<String> {
         let mut map: PackageDefinition = Default::default();

-        for (t, names) in self.iter() {
-            for name in names.iter() {
-                map.entry(t.ext_name()).or_default().insert(name.clone());
+        for (t, paths) in self.iter() {
+            for path in paths.iter() {
+                map.entry(t.ext_name())
+                    .or_default()
+                    .insert(path.display().to_string());
             }
         }
@@ -238,11 +222,11 @@ impl Package {
         for _ in 0..file_count {
             let t = BundleFileType::from(r.read_u64()?);
             let hash = Murmur64::from(r.read_u64()?);
-            let name = ctx.lookup_hash(hash, HashGroup::Filename);
+            let path = ctx.lookup_hash(hash, HashGroup::Filename);
             inner
                 .entry(t)
                 .or_default()
-                .insert(name.display().to_string());
+                .insert(PathBuf::from(path.display().to_string()));
         }

         let flags = r.read_u8()?;
@@ -255,7 +239,7 @@ impl Package {
         let pkg = Self {
             inner,
-            _name: name.into(),
+            _name: name,
             _root: PathBuf::new(),
             flags,
         };
@@ -271,10 +255,12 @@ impl Package {
         w.write_u32(0x2b)?;
         w.write_u32(self.values().flatten().count() as u32)?;

-        for (t, names) in self.iter() {
-            for name in names.iter() {
+        for (t, paths) in self.iter() {
+            for path in paths.iter() {
                 w.write_u64(t.hash().into())?;
-                w.write_u64(Murmur64::hash(name.as_bytes()).into())?;
+
+                let hash = Murmur64::hash(path.to_slash_lossy().as_bytes());
+                w.write_u64(hash.into())?;
             }
         }
@@ -294,11 +280,17 @@ where
     Ok(vec![UserFile::new(s.into_bytes())])
 }

+// #[tracing::instrument(skip_all)]
+// pub fn compile(_ctx: &crate::Context, data: String) -> Result<Vec<u8>> {
+//     let pkg = Package::from_sjson(data)?;
+//     pkg.to_binary()
+// }
+
 #[cfg(test)]
 mod test {
     use std::path::PathBuf;

-    use crate::bundle::filetype::BundleFileType;
+    use crate::BundleFileType;

     use super::resolve_wildcard;
     use super::Package;
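
Aside: the doc comment removed in the first hunk describes `resolve_wildcard`'s contract. A minimal, synchronous sketch of just that trailing-`*` rule (the real function walks the file system asynchronously; the paths here are made up):

    fn matches_wildcard(pattern: &str, path: &str) -> bool {
        match pattern.strip_suffix('*') {
            // `foo/*` matches everything below `foo/`, recursively.
            Some(prefix) => path.starts_with(prefix),
            // No wildcard: only an exact match counts.
            None => pattern == path,
        }
    }

    fn main() {
        assert!(matches_wildcard("textures/*", "textures/weapons/sword_01"));
        assert!(matches_wildcard("scripts/init", "scripts/init"));
        assert!(!matches_wildcard("textures/*", "sounds/theme"));
    }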


@@ -28,14 +28,10 @@ impl Language {
 #[derive(serde::Serialize)]
 pub struct Strings(HashMap<String, HashMap<Language, String>>);

-#[inline(always)]
 fn read_string<R>(r: R) -> Result<String>
 where
     R: Read,
 {
-    // We can safely ignore the warning here, as all data is already in memory, and no additional
-    // `BufReader` should be needed.
-    #[allow(clippy::unbuffered_bytes)]
     r.bytes()
         .take_while(|b| b.as_ref().map(|b| *b != 0).unwrap_or(false))
         .map(|b| b.map_err(Report::new))
@@ -45,7 +41,7 @@ where
 impl Strings {
     #[tracing::instrument(skip_all, fields(languages = variants.len()))]
-    pub fn from_variants(ctx: &crate::Context, variants: &[BundleFileVariant]) -> Result<Self> {
+    pub fn from_variants(ctx: &crate::Context, variants: &Vec<BundleFileVariant>) -> Result<Self> {
         let mut map: HashMap<String, HashMap<Language, String>> = HashMap::new();

         for (i, variant) in variants.iter().enumerate() {
@@ -80,7 +76,7 @@ impl Strings {
     }

     #[tracing::instrument(skip_all)]
-    pub fn decompile(ctx: &crate::Context, variants: &[BundleFileVariant]) -> Result<Vec<UserFile>> {
+    pub fn decompile(ctx: &crate::Context, variants: &Vec<BundleFileVariant>) -> Result<Vec<UserFile>> {
         let strings = Strings::from_variants(ctx, variants)?;
         let content = strings.to_sjson()?;
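
Aside: the two signature changes above are the standard `&[T]` vs `&Vec<T>` trade-off; a slice parameter accepts any contiguous storage and drops a level of indirection. Sketch:

    fn takes_slice(items: &[u32]) -> usize {
        items.len()
    }

    fn takes_vec_ref(items: &Vec<u32>) -> usize {
        items.len()
    }

    fn main() {
        let v = vec![1, 2, 3];
        let a = [4_u32, 5];

        assert_eq!(takes_slice(&v), 3); // &Vec<u32> auto-derefs to &[u32]
        assert_eq!(takes_slice(&a), 2); // arrays work too...
        assert_eq!(takes_vec_ref(&v), 3);
        // ...but `takes_vec_ref(&a)` would not compile.
    }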


@@ -9,5 +9,5 @@ pub mod murmur;
 pub use binary::{FromBinary, ToBinary};
 pub use bundle::database::BundleDatabase;
 pub use bundle::decompress;
-pub use bundle::{Bundle, BundleFile, BundleFileType, BundleFileVariant, Properties};
+pub use bundle::{Bundle, BundleFile, BundleFileType, BundleFileVariant};
 pub use context::{CmdLine, Context};


@@ -48,7 +48,6 @@ struct Row {
     group: HashGroup,
 }

-#[derive(Clone)]
 pub struct Entry {
     value: String,
     long: Murmur64,
@@ -74,7 +73,6 @@ impl Entry {
     }
 }

-#[derive(Clone)]
 pub struct Dictionary {
     entries: Vec<Entry>,
 }
@@ -90,12 +88,10 @@ impl Dictionary {
         Self { entries: vec![] }
     }

-    pub async fn from_csv<R>(r: R) -> Result<Self>
+    pub async fn from_csv<R>(&mut self, r: R) -> Result<()>
     where
         R: AsyncRead + std::marker::Unpin + std::marker::Send,
     {
-        let mut entries = vec![];
-
         let r = AsyncDeserializer::from_reader(r);
         let mut records = r.into_deserialize::<Row>();
@@ -116,10 +112,10 @@ impl Dictionary {
                 group: record.group,
             };

-            entries.push(entry);
+            self.entries.push(entry);
         }

-        Ok(Self { entries })
+        Ok(())
     }

     pub async fn to_csv<W>(&self, w: W) -> Result<()>
@@ -151,21 +147,21 @@ impl Dictionary {
         Ok(())
     }

-    pub fn add(&mut self, value: impl AsRef<[u8]>, group: HashGroup) {
-        let long = Murmur64::from(murmurhash64::hash(value.as_ref(), SEED as u64));
-        let short = Murmur32::from(murmurhash64::hash32(value.as_ref(), SEED));
+    pub fn add(&mut self, value: String, group: HashGroup) {
+        let long = Murmur64::from(murmurhash64::hash(value.as_bytes(), SEED as u64));
+        let short = Murmur32::from(murmurhash64::hash32(value.as_bytes(), SEED));

         let entry = Entry {
             long,
             short,
-            value: String::from_utf8_lossy(value.as_ref()).to_string(),
+            value,
             group,
         };

         self.entries.push(entry);
     }

-    pub fn find(&self, value: &String, group: HashGroup) -> Option<&Entry> {
+    pub fn find(&mut self, value: &String, group: HashGroup) -> Option<&Entry> {
         self.entries
             .iter()
             .find(|e| e.value == *value && e.group == group)
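
Aside: on the `-` side, `add` takes `value: impl AsRef<[u8]>` and stores a lossy UTF-8 copy, so callers can pass `&str`, `String`, or raw bytes alike. A stand-alone sketch of just that parameter pattern (not the real `Dictionary`):

    fn store(value: impl AsRef<[u8]>) -> String {
        // Mirrors `String::from_utf8_lossy(value.as_ref()).to_string()` above.
        String::from_utf8_lossy(value.as_ref()).to_string()
    }

    fn main() {
        assert_eq!(store("lua"), "lua");                  // &str
        assert_eq!(store(String::from("lua")), "lua");    // String
        assert_eq!(store(b"lua".to_vec()), "lua");        // Vec<u8>
        assert_eq!(store(&[0xF0, 0x9F][..]), "\u{FFFD}"); // invalid UTF-8 becomes U+FFFD
    }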


@@ -1,162 +0,0 @@
-use std::fmt;
-
-use serde::{Deserializer, Serializer};
-
-use super::Murmur32;
-
-// This type encodes the fact that when reading in a bundle, we don't always have a dictionary
-// entry for every hash in there. So we do want to have the real string available when needed,
-// but at the same time retain the original hash information for when we don't.
-// This is especially important when wanting to write back the read bundle, as the hashes need to
-// stay the same.
-// The previous system of always turning hashes into strings worked well for the purpose of
-// displaying hashes, but would have made it very hard to turn a stringified hash back into
-// an actual hash.
-#[derive(Clone, Debug, Eq)]
-pub enum IdString32 {
-    Hash(Murmur32),
-    String(String),
-}
-
-impl IdString32 {
-    pub fn to_murmur32(&self) -> Murmur32 {
-        match self {
-            Self::Hash(hash) => *hash,
-            Self::String(s) => Murmur32::hash(s.as_bytes()),
-        }
-    }
-
-    pub fn display(&self) -> IdString32Display {
-        let s = match self {
-            IdString32::Hash(hash) => hash.to_string(),
-            IdString32::String(s) => s.clone(),
-        };
-
-        IdString32Display(s)
-    }
-
-    pub fn is_string(&self) -> bool {
-        match self {
-            IdString32::Hash(_) => false,
-            IdString32::String(_) => true,
-        }
-    }
-
-    pub fn is_hash(&self) -> bool {
-        match self {
-            IdString32::Hash(_) => true,
-            IdString32::String(_) => false,
-        }
-    }
-}
-
-impl From<String> for IdString32 {
-    fn from(value: String) -> Self {
-        Self::String(value)
-    }
-}
-
-impl From<u32> for IdString32 {
-    fn from(value: u32) -> Self {
-        Self::Hash(value.into())
-    }
-}
-
-impl From<IdString32> for u32 {
-    fn from(value: IdString32) -> Self {
-        value.to_murmur32().into()
-    }
-}
-
-impl From<Murmur32> for IdString32 {
-    fn from(value: Murmur32) -> Self {
-        Self::Hash(value)
-    }
-}
-
-impl From<IdString32> for Murmur32 {
-    fn from(value: IdString32) -> Self {
-        value.to_murmur32()
-    }
-}
-
-impl PartialEq for IdString32 {
-    fn eq(&self, other: &Self) -> bool {
-        self.to_murmur32() == other.to_murmur32()
-    }
-}
-
-impl std::hash::Hash for IdString32 {
-    fn hash<H: std::hash::Hasher>(&self, state: &mut H) {
-        state.write_u32(self.to_murmur32().into());
-    }
-}
-
-impl serde::Serialize for IdString32 {
-    fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
-    where
-        S: Serializer,
-    {
-        serializer.serialize_u32(self.to_murmur32().into())
-    }
-}
-
-struct IdString32Visitor;
-
-impl<'de> serde::de::Visitor<'de> for IdString32Visitor {
-    type Value = IdString32;
-
-    fn expecting(&self, formatter: &mut fmt::Formatter) -> fmt::Result {
-        formatter.write_str("an u32 or a string")
-    }
-
-    fn visit_u32<E>(self, value: u32) -> Result<Self::Value, E>
-    where
-        E: serde::de::Error,
-    {
-        Ok(IdString32::Hash(value.into()))
-    }
-
-    fn visit_str<E>(self, v: &str) -> Result<Self::Value, E>
-    where
-        E: serde::de::Error,
-    {
-        Ok(IdString32::String(v.to_string()))
-    }
-
-    fn visit_string<E>(self, v: String) -> Result<Self::Value, E>
-    where
-        E: serde::de::Error,
-    {
-        Ok(IdString32::String(v))
-    }
-}
-
-impl<'de> serde::Deserialize<'de> for IdString32 {
-    fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
-    where
-        D: Deserializer<'de>,
-    {
-        deserializer.deserialize_u32(IdString32Visitor)
-    }
-}
-
-pub struct IdString32Display(String);
-
-impl std::fmt::Display for IdString32Display {
-    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
-        write!(f, "{}", self.0)
-    }
-}
-
-impl std::fmt::UpperHex for IdString32 {
-    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
-        std::fmt::UpperHex::fmt(&self.to_murmur32(), f)
-    }
-}
-
-impl std::fmt::LowerHex for IdString32 {
-    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
-        std::fmt::LowerHex::fmt(&self.to_murmur32(), f)
-    }
-}


@@ -1,175 +0,0 @@
-use std::{fmt, path::Path};
-
-use path_slash::PathExt as _;
-use serde::{Deserializer, Serializer};
-
-use super::Murmur64;
-
-// This type encodes the fact that when reading in a bundle, we don't always have a dictionary
-// entry for every hash in there. So we do want to have the real string available when needed,
-// but at the same time retain the original hash information for when we don't.
-// This is especially important when wanting to write back the read bundle, as the hashes need to
-// stay the same.
-// The previous system of always turning hashes into strings worked well for the purpose of
-// displaying hashes, but would have made it very hard to turn a stringified hash back into
-// an actual hash.
-#[derive(Clone, Debug, Eq)]
-pub enum IdString64 {
-    Hash(Murmur64),
-    String(String),
-}
-
-impl IdString64 {
-    pub fn to_murmur64(&self) -> Murmur64 {
-        match self {
-            Self::Hash(hash) => *hash,
-            Self::String(s) => Murmur64::hash(s.as_bytes()),
-        }
-    }
-
-    pub fn display(&self) -> IdString64Display {
-        let s = match self {
-            IdString64::Hash(hash) => hash.to_string(),
-            IdString64::String(s) => s.clone(),
-        };
-
-        IdString64Display(s)
-    }
-
-    pub fn is_string(&self) -> bool {
-        match self {
-            IdString64::Hash(_) => false,
-            IdString64::String(_) => true,
-        }
-    }
-
-    pub fn is_hash(&self) -> bool {
-        match self {
-            IdString64::Hash(_) => true,
-            IdString64::String(_) => false,
-        }
-    }
-
-    // Would love to have this as a proper `impl From`, but
-    // rustc will complain that it overlaps with the `impl From<Into<String>>`.
-    pub fn from_path(p: impl AsRef<Path>) -> Self {
-        Self::String(p.as_ref().to_slash_lossy().to_string())
-    }
-}
-
-impl From<String> for IdString64 {
-    fn from(value: String) -> Self {
-        Self::String(value)
-    }
-}
-
-impl From<u64> for IdString64 {
-    fn from(value: u64) -> Self {
-        Self::Hash(value.into())
-    }
-}
-
-impl From<Murmur64> for IdString64 {
-    fn from(value: Murmur64) -> Self {
-        Self::Hash(value)
-    }
-}
-
-impl From<IdString64> for Murmur64 {
-    fn from(value: IdString64) -> Self {
-        value.to_murmur64()
-    }
-}
-
-impl From<IdString64> for u64 {
-    fn from(value: IdString64) -> Self {
-        value.to_murmur64().into()
-    }
-}
-
-impl Default for IdString64 {
-    fn default() -> Self {
-        Self::Hash(0.into())
-    }
-}
-
-impl PartialEq for IdString64 {
-    fn eq(&self, other: &Self) -> bool {
-        self.to_murmur64() == other.to_murmur64()
-    }
-}
-
-impl std::hash::Hash for IdString64 {
-    fn hash<H: std::hash::Hasher>(&self, state: &mut H) {
-        state.write_u64(self.to_murmur64().into());
-    }
-}
-
-impl serde::Serialize for IdString64 {
-    fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
-    where
-        S: Serializer,
-    {
-        serializer.serialize_u64(self.to_murmur64().into())
-    }
-}
-
-struct IdString64Visitor;
-
-impl<'de> serde::de::Visitor<'de> for IdString64Visitor {
-    type Value = IdString64;
-
-    fn expecting(&self, formatter: &mut fmt::Formatter) -> fmt::Result {
-        formatter.write_str("an u64 or a string")
-    }
-
-    fn visit_u64<E>(self, value: u64) -> Result<Self::Value, E>
-    where
-        E: serde::de::Error,
-    {
-        Ok(IdString64::Hash(value.into()))
-    }
-
-    fn visit_str<E>(self, v: &str) -> Result<Self::Value, E>
-    where
-        E: serde::de::Error,
-    {
-        Ok(IdString64::String(v.to_string()))
-    }
-
-    fn visit_string<E>(self, v: String) -> Result<Self::Value, E>
-    where
-        E: serde::de::Error,
-    {
-        Ok(IdString64::String(v))
-    }
-}
-
-impl<'de> serde::Deserialize<'de> for IdString64 {
-    fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
-    where
-        D: Deserializer<'de>,
-    {
-        deserializer.deserialize_u64(IdString64Visitor)
-    }
-}
-
-pub struct IdString64Display(String);
-
-impl std::fmt::Display for IdString64Display {
-    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
-        write!(f, "{}", self.0)
-    }
-}
-
-impl std::fmt::UpperHex for IdString64 {
-    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
-        std::fmt::UpperHex::fmt(&self.to_murmur64(), f)
-    }
-}
-
-impl std::fmt::LowerHex for IdString64 {
-    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
-        std::fmt::LowerHex::fmt(&self.to_murmur64(), f)
-    }
-}


@@ -8,8 +8,6 @@ use serde::{Deserialize, Deserializer, Serialize, Serializer};
 mod dictionary;
 // Currently unused
 // mod murmurhash32;
-mod idstring32;
-mod idstring64;
 mod murmurhash64;
 mod types;
 mod util;
@@ -17,8 +15,6 @@ mod util;
 pub const SEED: u32 = 0;

 pub use dictionary::{Dictionary, Entry, HashGroup};
-pub use idstring32::*;
-pub use idstring64::*;
 pub use murmurhash64::hash;
 pub use murmurhash64::hash32;
 pub use murmurhash64::hash_inverse as inverse;


@@ -119,9 +119,4 @@ fn test_hash() {
 }

 #[test]
-fn test_inverse() {
-    let h = hash("lua".as_bytes(), crate::murmur::SEED as u64);
-    let inv = hash_inverse(h, crate::murmur::SEED as u64);
-    assert_eq!(h, hash(&inv.to_le_bytes(), crate::murmur::SEED as u64));
-    assert_ne!(h, hash(&inv.to_be_bytes(), crate::murmur::SEED as u64));
-}
+fn test_inverse() {}
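
Aside: the deleted test pinned down what `hash_inverse` promises: a 64-bit preimage whose little-endian bytes hash back to the original value (and the `assert_ne!` shows byte order matters). Inversion is possible because each step of MurmurHash64A's mixing is a bijection on `u64`. A self-contained sketch of undoing one such step; the constants are MurmurHash64A's `m` and `r = 47`, but this is illustrative, not the crate's actual `hash_inverse`:

    const M: u64 = 0xc6a4_a793_5bd1_e995;

    // Invert `x ^= x >> 47`: because 47 >= 32, a single reapplication recovers x.
    fn unshift47(y: u64) -> u64 {
        y ^ (y >> 47)
    }

    // Multiplicative inverse of an odd u64 modulo 2^64, via Newton iteration
    // (each round doubles the number of correct low bits).
    fn mul_inverse(a: u64) -> u64 {
        let mut x = a;
        for _ in 0..5 {
            x = x.wrapping_mul(2u64.wrapping_sub(a.wrapping_mul(x)));
        }
        x
    }

    fn main() {
        let h0: u64 = 0xdead_beef_cafe_f00d;

        // Forward: one murmur finalization step.
        let mixed = (h0 ^ (h0 >> 47)).wrapping_mul(M);

        // Backward: undo the multiply, then the xor-shift.
        let recovered = unshift47(mixed.wrapping_mul(mul_inverse(M)));
        assert_eq!(recovered, h0);
    }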


@@ -50,7 +50,7 @@ impl fmt::LowerHex for Murmur64 {
 impl fmt::Display for Murmur64 {
     fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
-        write!(f, "{self:016X}")
+        write!(f, "{:016X}", self)
     }
 }
@@ -150,15 +150,9 @@ impl fmt::UpperHex for Murmur32 {
     }
 }

-impl fmt::LowerHex for Murmur32 {
-    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
-        fmt::LowerHex::fmt(&self.0, f)
-    }
-}
-
 impl fmt::Display for Murmur32 {
     fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
-        write!(f, "{self:08X}")
+        write!(f, "{:08X}", self)
     }
 }
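
Aside: the two `write!` changes in this file are purely syntactic. Inline format arguments (`{self:016X}`, stabilized in Rust 1.58) and positional ones produce identical output:

    fn main() {
        let value: u64 = 0x8253_461B;
        assert_eq!(format!("{value:016X}"), format!("{:016X}", value));
        assert_eq!(format!("{value:016X}"), "000000008253461B");
    }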
@@ -224,3 +218,148 @@ impl<'de> Deserialize<'de> for Murmur32 {
         deserializer.deserialize_any(Self(0))
     }
 }
+
+// This type encodes the fact that when reading in a bundle, we don't always have a dictionary
+// entry for every hash in there. So we do want to have the real string available when needed,
+// but at the same time retain the original hash information for when we don't.
+// This is especially important when wanting to write back the read bundle, as the hashes need to
+// stay the same.
+// The previous system of always turning hashes into strings worked well for the purpose of
+// displaying hashes, but would have made it very hard to turn a stringified hash back into
+// an actual hash.
+#[derive(Clone, Debug, Eq)]
+pub enum IdString64 {
+    Hash(Murmur64),
+    String(String),
+}
+
+impl IdString64 {
+    pub fn to_murmur64(&self) -> Murmur64 {
+        match self {
+            Self::Hash(hash) => *hash,
+            Self::String(s) => Murmur64::hash(s.as_bytes()),
+        }
+    }
+
+    pub fn display(&self) -> IdString64Display {
+        let s = match self {
+            IdString64::Hash(hash) => hash.to_string(),
+            IdString64::String(s) => s.clone(),
+        };
+
+        IdString64Display(s)
+    }
+
+    pub fn is_string(&self) -> bool {
+        match self {
+            IdString64::Hash(_) => false,
+            IdString64::String(_) => true,
+        }
+    }
+
+    pub fn is_hash(&self) -> bool {
+        match self {
+            IdString64::Hash(_) => true,
+            IdString64::String(_) => false,
+        }
+    }
+}
+
+impl<S: Into<String>> From<S> for IdString64 {
+    fn from(value: S) -> Self {
+        Self::String(value.into())
+    }
+}
+
+impl From<Murmur64> for IdString64 {
+    fn from(value: Murmur64) -> Self {
+        Self::Hash(value)
+    }
+}
+
+impl From<IdString64> for Murmur64 {
+    fn from(value: IdString64) -> Self {
+        value.to_murmur64()
+    }
+}
+
+impl PartialEq for IdString64 {
+    fn eq(&self, other: &Self) -> bool {
+        self.to_murmur64() == other.to_murmur64()
+    }
+}
+
+impl std::hash::Hash for IdString64 {
+    fn hash<H: std::hash::Hasher>(&self, state: &mut H) {
+        state.write_u64(self.to_murmur64().into());
+    }
+}
+
+impl serde::Serialize for IdString64 {
+    fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
+    where
+        S: Serializer,
+    {
+        serializer.serialize_u64(self.to_murmur64().into())
+    }
+}
+
+struct IdString64Visitor;
+
+impl<'de> serde::de::Visitor<'de> for IdString64Visitor {
+    type Value = IdString64;
+
+    fn expecting(&self, formatter: &mut fmt::Formatter) -> fmt::Result {
+        formatter.write_str("an u64 or a string")
+    }
+
+    fn visit_u64<E>(self, value: u64) -> Result<Self::Value, E>
+    where
+        E: serde::de::Error,
+    {
+        Ok(IdString64::Hash(value.into()))
+    }
+
+    fn visit_str<E>(self, v: &str) -> Result<Self::Value, E>
+    where
+        E: serde::de::Error,
+    {
+        Ok(IdString64::String(v.to_string()))
+    }
+
+    fn visit_string<E>(self, v: String) -> Result<Self::Value, E>
+    where
+        E: serde::de::Error,
+    {
+        Ok(IdString64::String(v))
+    }
+}
+
+impl<'de> serde::Deserialize<'de> for IdString64 {
+    fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
+    where
+        D: Deserializer<'de>,
+    {
+        deserializer.deserialize_u64(IdString64Visitor)
+    }
+}
+
+pub struct IdString64Display(String);
+
+impl std::fmt::Display for IdString64Display {
+    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
+        write!(f, "{}", self.0)
+    }
+}
+
+impl std::fmt::UpperHex for IdString64 {
+    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
+        std::fmt::UpperHex::fmt(&self.to_murmur64(), f)
+    }
+}
+
+impl std::fmt::LowerHex for IdString64 {
+    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
+        std::fmt::LowerHex::fmt(&self.to_murmur64(), f)
+    }
+}
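
Aside: the comment at the top of this addition explains the whole design: keep the hash when that is all we have, keep the string when we know it, and route equality and hashing through the numeric form so the two variants interoperate. A compact stand-in showing that behaviour, with a placeholder hash instead of Murmur64:

    #[derive(Debug, Eq)]
    enum Id {
        Hash(u64),
        String(String),
    }

    // Placeholder for Murmur64::hash; any stable 64-bit hash illustrates the point.
    fn pseudo_hash(bytes: &[u8]) -> u64 {
        bytes
            .iter()
            .fold(0xcbf2_9ce4_8422_2325, |h, b| {
                (h ^ *b as u64).wrapping_mul(0x0000_0100_0000_01b3)
            })
    }

    impl Id {
        fn to_hash(&self) -> u64 {
            match self {
                Id::Hash(h) => *h,
                Id::String(s) => pseudo_hash(s.as_bytes()),
            }
        }
    }

    impl PartialEq for Id {
        fn eq(&self, other: &Self) -> bool {
            self.to_hash() == other.to_hash()
        }
    }

    fn main() {
        // A bundle read from disk may only know the hash; the dictionary knows the name.
        let from_dictionary = Id::String("scripts/boot".to_string());
        let from_bundle = Id::Hash(pseudo_hash(b"scripts/boot"));
        assert_eq!(from_dictionary, from_bundle);
    }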

lib/serde_sjson Submodule

@@ -0,0 +1 @@
+Subproject commit 73d2b23ce50e75b184f5092ad515e97a0adbe6da