Aida on Nostr:
So it wasn't as easy as I expected. For some reason it can't be done by piping eventstore query output directly into another eventstore command; I'm not sure why. Out of roughly 400k events, only about 12 were saved that way...
So I did it in multiple steps.
First, export everything to a JSONL file:
`./eventstore -d db -t badger query '{}' > export.jsonl`
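Before importing, it's worth sanity-checking the export: each event is serialized as one JSON object per line, so the line count of the file should match the number of events in the store. A small stand-in demo (`sample.jsonl` here is hypothetical, standing in for the real `export.jsonl`):

```
# Each exported event is one JSON object per line, so the line
# count should match the event count in the store.
# sample.jsonl is a tiny stand-in for the real export.jsonl.
printf '%s\n' '{"kind":1}' '{"kind":7}' > sample.jsonl
wc -l < sample.jsonl | tr -d ' '   # → 2
```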
Then I had to raise the stack size from 8 MB to 32 MB:
`ulimit -s 32768`
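Note that the `ulimit` change applies only to the current shell and the processes it starts, so it has to be run in the same session that later runs the import script. You can verify it took effect by querying the soft limit (reported in KB):

```
# ulimit changes only affect this shell and its children,
# so raise the limit in the same session as the import script.
# Query the current soft stack limit (in KB):
ulimit -s
```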
And at the end I just ran a script that called eventstore to insert every line/note into SQLite:
```
#!/usr/bin/env zsh
if [ $# -lt 1 ]; then
  echo "Usage: $0 <file.jsonl>"
  exit 1
fi

input_file="$1"
# Save each JSON line as a separate event in the SQLite store.
while IFS= read -r line; do
  ./eventstore -d export.sqlite -t sqlite save "$line"
done < "$input_file"
```
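One detail worth keeping in that loop: `IFS= read -r` reads each line verbatim, and quoting `"$line"` then hands the JSON to eventstore unmodified. A minimal sketch of just the reading behavior, with `printf` standing in for the eventstore call:

```
#!/usr/bin/env zsh
# Demo of the read loop above: IFS= keeps leading whitespace,
# -r keeps backslashes, and quoting "$line" passes the JSON
# through unchanged. printf stands in for the eventstore call.
printf '%s\n' '{"content":"a\nb"}' '  {"indented":true}' > demo.jsonl
while IFS= read -r line; do
  printf 'got: %s\n' "$line"
done < demo.jsonl
```

Without `-r`, `read` would strip the backslash in `\n`; without `IFS=`, the leading spaces on the second line would be lost, and either would corrupt the event before it reached the database.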