## Supported formats
[fq -rn -L . 'include "formats"; formats_table']: sh-start
|Name |Description |Dependencies|
|- |- |-|
|[`aac_frame`](#aac_frame) |Advanced Audio Coding frame ||
|`adts` |Audio Data Transport Stream |`adts_frame`|
|`adts_frame` |Audio Data Transport Stream frame |`aac_frame`|
|`aiff` |Audio Interchange File Format ||
|`amf0` |Action Message Format 0 ||
|`apev2` |APEv2 metadata tag |`image`|
|[`apple_bookmark`](#apple_bookmark) |Apple BookmarkData ||
|`ar` |Unix archive |`probe`|
|[`asn1_ber`](#asn1_ber) |ASN1 BER (basic encoding rules, also CER and DER) ||
|`av1_ccr` |AV1 Codec Configuration Record ||
|`av1_frame` |AV1 frame |`av1_obu`|
|`av1_obu` |AV1 Open Bitstream Unit ||
|`avc_annexb` |H.264/AVC Annex B |`avc_nalu`|
|[`avc_au`](#avc_au) |H.264/AVC Access Unit |`avc_nalu`|
|`avc_dcr` |H.264/AVC Decoder Configuration Record |`avc_nalu`|
|`avc_nalu` |H.264/AVC Network Access Layer Unit |`avc_sps` `avc_pps` `avc_sei`|
|`avc_pps` |H.264/AVC Picture Parameter Set ||
|`avc_sei` |H.264/AVC Supplemental Enhancement Information ||
|`avc_sps` |H.264/AVC Sequence Parameter Set ||
|[`avi`](#avi) |Audio Video Interleaved |`avc_au` `hevc_au` `mp3_frame` `flac_frame`|
|[`avro_ocf`](#avro_ocf) |Avro object container file ||
|[`bencode`](#bencode) |BitTorrent bencoding ||
|`bitcoin_blkdat` |Bitcoin blk.dat |`bitcoin_block`|
|[`bitcoin_block`](#bitcoin_block) |Bitcoin block |`bitcoin_transaction`|
|`bitcoin_script` |Bitcoin script ||
|`bitcoin_transaction` |Bitcoin transaction |`bitcoin_script`|
|[`bits`](#bits) |Raw bits ||
|[`bplist`](#bplist) |Apple Binary Property List ||
|`bsd_loopback_frame` |BSD loopback frame |`inet_packet`|
|[`bson`](#bson) |Binary JSON ||
|[`bytes`](#bytes) |Raw bytes ||
|`bzip2` |bzip2 compression |`probe`|
|[`caff`](#caff) |Live2D Cubism archive |`probe`|
|[`cbor`](#cbor) |Concise Binary Object Representation ||
|[`csv`](#csv) |Comma separated values ||
|`dns` |DNS packet ||
|`dns_tcp` |DNS packet (TCP) ||
|`elf` |Executable and Linkable Format ||
|`ether8023_frame` |Ethernet 802.3 frame |`inet_packet`|
|`exif` |Exchangeable Image File Format ||
|`fairplay_spc` |FairPlay Server Playback Context ||
|`flac` |Free Lossless Audio Codec file |`flac_metadatablocks` `flac_frame`|
|[`flac_frame`](#flac_frame) |FLAC frame ||
|`flac_metadatablock` |FLAC metadatablock |`flac_streaminfo` `flac_picture` `vorbis_comment`|
|`flac_metadatablocks` |FLAC metadatablocks |`flac_metadatablock`|
|`flac_picture` |FLAC metadatablock picture |`image`|
|`flac_streaminfo` |FLAC streaminfo ||
|`gif` |Graphics Interchange Format ||
|`gzip` |gzip compression |`probe`|
|`hevc_annexb` |H.265/HEVC Annex B |`hevc_nalu`|
|[`hevc_au`](#hevc_au) |H.265/HEVC Access Unit |`hevc_nalu`|
|`hevc_dcr` |H.265/HEVC Decoder Configuration Record |`hevc_nalu`|
|`hevc_nalu` |H.265/HEVC Network Access Layer Unit |`hevc_vps` `hevc_pps` `hevc_sps`|
|`hevc_pps` |H.265/HEVC Picture Parameter Set ||
|`hevc_sps` |H.265/HEVC Sequence Parameter Set ||
|`hevc_vps` |H.265/HEVC Video Parameter Set ||
|[`html`](#html) |HyperText Markup Language ||
|`icc_profile` |International Color Consortium profile ||
|`icmp` |Internet Control Message Protocol ||
|`icmpv6` |Internet Control Message Protocol v6 ||
|`id3v1` |ID3v1 metadata ||
|`id3v11` |ID3v1.1 metadata ||
|`id3v2` |ID3v2 metadata |`image`|
|`ipv4_packet` |Internet protocol v4 packet |`ip_packet`|
|`ipv6_packet` |Internet protocol v6 packet |`ip_packet`|
|`jpeg` |Joint Photographic Experts Group file |`exif` `icc_profile`|
|`json` |JavaScript Object Notation ||
|`jsonl` |JavaScript Object Notation Lines ||
|[`leveldb_descriptor`](#leveldb_descriptor) |LevelDB Descriptor ||
|[`leveldb_log`](#leveldb_log) |LevelDB Log ||
|[`leveldb_table`](#leveldb_table) |LevelDB Table ||
|[`luajit`](#luajit) |LuaJIT 2.0 bytecode ||
|[`macho`](#macho) |Mach-O macOS executable ||
|`macho_fat` |Fat Mach-O macOS executable (multi-architecture) |`macho`|
|[`markdown`](#markdown) |Markdown ||
|[`matroska`](#matroska) |Matroska file |`aac_frame` `av1_ccr` `av1_frame` `avc_au` `avc_dcr` `flac_frame` `flac_metadatablocks` `hevc_au` `hevc_dcr` `image` `mp3_frame` `mpeg_asc` `mpeg_pes_packet` `mpeg_spu` `opus_packet` `vorbis_packet` `vp8_frame` `vp9_cfm` `vp9_frame`|
|[`moc3`](#moc3) |MOC3 file ||
|[`mp3`](#mp3) |MP3 file |`id3v2` `id3v1` `id3v11` `apev2` `mp3_frame`|
|`mp3_frame` |MPEG audio layer 3 frame |`mp3_frame_tags`|
|`mp3_frame_vbri` |MP3 frame Fraunhofer encoder variable bitrate tag ||
|`mp3_frame_xing` |MP3 frame Xing/Info tag ||
|[`mp4`](#mp4) |ISOBMFF, QuickTime and similar |`aac_frame` `av1_ccr` `av1_frame` `avc_au` `avc_dcr` `flac_frame` `flac_metadatablocks` `hevc_au` `hevc_dcr` `icc_profile` `id3v2` `image` `jpeg` `mp3_frame` `mpeg_es` `mpeg_pes_packet` `opus_packet` `png` `prores_frame` `protobuf_widevine` `pssh_playready` `vorbis_packet` `vp9_frame` `vpx_ccr`|
|`mpeg_asc` |MPEG-4 Audio Specific Config ||
|`mpeg_es` |MPEG Elementary Stream |`mpeg_asc` `vorbis_packet`|
|`mpeg_pes` |MPEG Packetized elementary stream |`mpeg_pes_packet` `mpeg_spu`|
|`mpeg_pes_packet` |MPEG Packetized elementary stream packet ||
|`mpeg_spu` |Sub Picture Unit (DVD subtitle) ||
|`mpeg_ts` |MPEG Transport Stream ||
|[`msgpack`](#msgpack) |MessagePack ||
|`ogg` |OGG file |`ogg_page` `vorbis_packet` `opus_packet` `flac_metadatablock` `flac_frame`|
|`ogg_page` |OGG page ||
|[`opentimestamps`](#opentimestamps) |OpenTimestamps file ||
|`opus_packet` |Opus packet |`vorbis_comment`|
|[`pcap`](#pcap) |PCAP packet capture |`link_frame` `tcp_stream` `ipv4_packet`|
|`pcapng` |PCAPNG packet capture |`link_frame` `tcp_stream` `ipv4_packet`|
|[`pg_btree`](#pg_btree) |PostgreSQL btree index file ||
|[`pg_control`](#pg_control) |PostgreSQL control file ||
|[`pg_heap`](#pg_heap) |PostgreSQL heap file ||
|`png` |Portable Network Graphics file |`icc_profile` `exif`|
|`prores_frame` |Apple ProRes frame ||
|[`protobuf`](#protobuf) |Protobuf ||
|`protobuf_widevine` |Widevine protobuf |`protobuf`|
|`pssh_playready` |PlayReady PSSH ||
|[`rtmp`](#rtmp) |Real-Time Messaging Protocol |`amf0` `mpeg_asc`|
|`sll2_packet` |Linux cooked capture encapsulation v2 |`inet_packet`|
|`sll_packet` |Linux cooked capture encapsulation |`inet_packet`|
|`tar` |Tar archive |`probe`|
|`tcp_segment` |Transmission control protocol segment ||
|`tiff` |Tag Image File Format |`icc_profile`|
|[`tls`](#tls) |Transport layer security |`asn1_ber`|
|`toml` |Tom's Obvious, Minimal Language ||
|[`tzif`](#tzif) |Time Zone Information Format ||
|`udp_datagram` |User datagram protocol |`udp_payload`|
|`vorbis_comment` |Vorbis comment |`flac_picture`|
|`vorbis_packet` |Vorbis packet |`vorbis_comment`|
|`vp8_frame` |VP8 frame ||
|`vp9_cfm` |VP9 Codec Feature Metadata ||
|`vp9_frame` |VP9 frame ||
|`vpx_ccr` |VPX Codec Configuration Record ||
|[`wasm`](#wasm) |WebAssembly Binary Format ||
|`wav` |WAV file |`id3v2` `id3v1` `id3v11`|
|`webp` |WebP image |`exif` `vp8_frame` `icc_profile` `xml`|
|[`xml`](#xml) |Extensible Markup Language ||
|`yaml` |YAML Ain't Markup Language ||
|[`zip`](#zip) |ZIP archive |`probe`|
|`image` |Group |`gif` `jpeg` `mp4` `png` `tiff` `webp`|
|`inet_packet` |Group |`ipv4_packet` `ipv6_packet`|
|`ip_packet` |Group |`icmp` `icmpv6` `tcp_segment` `udp_datagram`|
|`link_frame` |Group |`bsd_loopback_frame` `ether8023_frame` `ipv4_packet` `ipv6_packet` `sll2_packet` `sll_packet`|
|`mp3_frame_tags` |Group |`mp3_frame_vbri` `mp3_frame_xing`|
|`probe` |Group |`adts` `aiff` `apple_bookmark` `ar` `avi` `avro_ocf` `bitcoin_blkdat` `bplist` `bzip2` `caff` `elf` `flac` `gif` `gzip` `html` `jpeg` `json` `jsonl` `leveldb_table` `luajit` `macho` `macho_fat` `matroska` `moc3` `mp3` `mp4` `mpeg_ts` `ogg` `opentimestamps` `pcap` `pcapng` `png` `tar` `tiff` `toml` `tzif` `wasm` `wav` `webp` `xml` `yaml` `zip`|
|`tcp_stream` |Group |`dns_tcp` `rtmp` `tls`|
|`udp_payload` |Group |`dns`|
[#]: sh-end
## Global format options
Currently the only global option is `force`, which is used to ignore some format assertion errors. It can be used as a decode option or as a CLI `-o` option:
```
fq -d mp4 -o force=true file.mp4
fq -d bytes 'mp4({force: true})' file.mp4
```
## Format details
[fq -rn -L . 'include "formats"; formats_sections']: sh-start
## aac_frame
### Options
|Name |Default|Description|
|- |- |-|
|`object_type`|1 |Audio object type|
### Examples
Decode file using aac_frame options
```
$ fq -d aac_frame -o object_type=1 . file
```
Decode value as aac_frame
```
... | aac_frame({object_type:1})
```
## apple_bookmark
Apple's `bookmarkData` format is used to encode information that can be resolved
into a `URL` object for a file even if the user moves or renames it. It can also
contain security scoping information for App Sandbox support.
These `bookmarkData` blobs are often found encoded in data fields of Binary
Property Lists. Notable examples include:
- `com.apple.finder.plist` - contains an `FXRecentFolders` value, which is an
array of ten objects, each of which consists of a `name` and `file-bookmark`
field, which is a `bookmarkData` object for each recently accessed folder
location.
- `com.apple.LSSharedFileList.RecentApplications.sfl2` - `sfl2` files are
actually `plist` files of the `NSKeyedArchiver` format. They can be parsed the
same as `plist` files, but they have a more complicated tree-like structure
than would typically be found, which can make locating and retrieving specific
values difficult, even once it has been converted to a JSON representation.
For more information about these types of files, see Sarah Edwards' excellent
research on the subject (link in references).
`fq`'s `grep_by` function can be used to recursively descend through the decoded
tree, probing for and selecting any `bookmark` blobs, then converting them to
readable JSON with `torepr`:
```
fq 'grep_by(.type=="data" and .value[0:4] == "book") | .value | apple_bookmark | torepr'
```
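As a concrete application of the query above to the Finder preferences file mentioned earlier (a sketch; the plist location and key layout are assumed to be as described above):
```sh
# decode every bookmarkData blob found in the plist to readable JSON
$ fq 'grep_by(.type=="data" and .value[0:4] == "book") | .value | apple_bookmark | torepr' com.apple.finder.plist
```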
### Authors
- David McDonald
[@dgmcdona](https://github.com/dgmcdona)
[@river_rat_504](https://twitter.com/river_rat_504)
### References
- https://developer.apple.com/documentation/foundation/url/2143023-bookmarkdata
- https://mac-alias.readthedocs.io/en/latest/bookmark_fmt.html
- https://www.mac4n6.com/blog/2016/1/1/manual-analysis-of-nskeyedarchiver-formatted-plist-files-a-review-of-the-new-os-x-1011-recent-items
- https://michaellynn.github.io/2015/10/24/apples-bookmarkdata-exposed/
## asn1_ber
Supports decoding BER, CER and DER (X.690).
- Currently no extra validation is done for CER and DER.
- Does not support specifying a schema.
- Supports `torepr` but without schema all sequences and sets will be arrays.
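For example, a schema-less dump to JSON (as noted above, sequences and sets come out as arrays):
```sh
$ fq -d asn1_ber torepr file.ber
```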
### Can be used to decode certificates etc
```sh
$ fq -d bytes 'from_pem | asn1_ber | d' cert.pem
```
### Can decode nested values
```sh
$ fq -d asn1_ber '.constructed[1].value | asn1_ber' file.ber
```
### Manual schema
```sh
$ fq -d asn1_ber 'torepr as $r | ["version", "modulus", "public_exponent", "private_exponent", "prime1", "prime2", "exponent1", "exponent2", "coefficient"] | with_entries({key: .value, value: $r[.key]})' pkcs1.der
```
### References
- https://www.itu.int/ITU-T/studygroups/com10/languages/X.690_1297.pdf
- https://en.wikipedia.org/wiki/X.690
- https://letsencrypt.org/docs/a-warm-welcome-to-asn1-and-der/
- https://lapo.it/asn1js/
## avc_au
### Options
|Name |Default|Description|
|- |- |-|
|`length_size`|0 |Length value size|
### Examples
Decode file using avc_au options
```
$ fq -d avc_au -o length_size=0 . file
```
Decode value as avc_au
```
... | avc_au({length_size:0})
```
## avi
### Options
|Name |Default|Description|
|- |- |-|
|`decode_extended_chunks`|true |Decode extended chunks|
|`decode_samples` |true |Decode samples|
### Examples
Decode file using avi options
```
$ fq -d avi -o decode_extended_chunks=true -o decode_samples=true . file
```
Decode value as avi
```
... | avi({decode_extended_chunks:true,decode_samples:true})
```
### Samples
AVI has many redundant ways to index samples, so currently `.streams[].samples` will only include samples from the most "modern" index found in the file. The order of preference is: stream super index, then movi ix index, then idx1 index.
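A quick sketch to see how many samples ended up indexed per stream:
```sh
$ fq '.streams | map(.samples | length)' file.avi
```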
### Extract samples for stream 1
```sh
$ fq '.streams[1].samples[] | tobytes' file.avi > stream01.mp3
```
### Show stream summary
```sh
$ fq -o decode_samples=false '[.chunks[0] | grep_by(.id=="LIST" and .type=="strl") | grep_by(.id=="strh") as {$type} | grep_by(.id=="strf") as {$format_tag, $compression} | {$type,$format_tag,$compression}]' *.avi
```
### Speed up decoding by disabling sample and extended chunks decoding
If you're not interested in sample details or extended chunks you can speed up decoding by using:
```sh
$ fq -o decode_samples=false -o decode_extended_chunks=false d file.avi
```
### References
- [AVI RIFF File Reference](https://learn.microsoft.com/en-us/windows/win32/directshow/avi-riff-file-reference)
- [OpenDML AVI File Format Extensions](http://www.jmcgowan.com/odmlff2.pdf)
## avro_ocf
Supports reading Avro Object Container Format (OCF) files based on the 1.11.0 specification.
Capable of handling null, deflate, and snappy codecs for data compression.
Limitations:
- Schema does not support self-referential types, only built-in types.
- Decimal logical types are not supported for decoding and will just be treated as their primitive type.
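A minimal decode sketch (since `avro_ocf` is part of the `probe` group, the explicit `-d` can usually be omitted):
```sh
$ fq -d avro_ocf d file.avro
```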
### References
- https://avro.apache.org/docs/current/spec.html#Object+Container+Files
### Authors
- Xentripetal
xentripetal@fastmail.com
[@xentripetal](https://github.com/xentripetal)
## bencode
### Convert represented value to JSON
```
$ fq -d bencode torepr file.torrent
```
### References
- https://wiki.theory.org/BitTorrentSpecification#Bencoding
## bitcoin_block
### Options
|Name |Default|Description|
|- |- |-|
|`has_header`|false |Has blkdat header|
### Examples
Decode file using bitcoin_block options
```
$ fq -d bitcoin_block -o has_header=false . file
```
Decode value as bitcoin_block
```
... | bitcoin_block({has_header:false})
```
## bits
Decode to a sliceable and indexable binary of bits.
### Slice and decode bit range
```sh
$ echo 'some {"a":1} json' | fq -d bits '.[40:-48] | fromjson'
{
"a": 1
}
```
### Index bits
```sh
$ echo 'hello' | fq -d bits '.[4]'
1
$ echo 'hello' | fq -c -d bits '[.[range(8)]]'
[0,1,1,0,1,0,0,0]
```
## bplist
### Show full decoding
```sh
$ fq d Info.plist
```
### Timestamps
Timestamps in Apple Binary Property Lists are encoded as Cocoa Core Data
timestamps, where the raw value is the floating point number of seconds since
January 1, 2001. By default, `fq` will render the raw floating point value. To
get the raw value or a human-readable date string, use the `tovalue` and
`todescription` functions respectively:
```sh
$ fq 'torepr.SomeTimeStamp | tovalue' Info.plist
685135328
$ fq 'torepr.SomeTimeStamp | todescription' Info.plist
"2022-09-17T19:22:08Z"
```
### Get JSON representation
`bplist` files can be converted to a JSON representation using the `torepr` filter:
```sh
$ fq torepr com.apple.UIAutomation.plist
{
"UIAutomationEnabled": true
}
```
### Decoding NSKeyedArchiver serialized objects
A common way that Swift and Objective-C libraries on macOS serialize objects
is through the NSKeyedArchiver API, which flattens objects into a list of elements
and class descriptions that are reconstructed into an object graph using CFUID
elements in the property list. `fq` includes a function, `from_ns_keyed_archiver`,
which will rebuild this object graph into a friendly representation.
If no parameters are supplied, it will assume that there is a CFUID located at
`."$top".root` that specifies the root from which decoding should occur. If this
is not present, an error will be produced, asking the user to specify a root
object in the `.$objects` list from which to decode.
The following examples show how this might be used (in this case, within the `fq` REPL):
```
# Assume $top.root is present
bplist> from_ns_keyed_archiver
# Specify optional root
bplist> from_ns_keyed_archiver(1)
```
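The same can be done non-interactively; a sketch using the `sfl2` file mentioned above (assuming it probes as a `bplist`):
```sh
$ fq from_ns_keyed_archiver com.apple.LSSharedFileList.RecentApplications.sfl2
```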
### Authors
- David McDonald
[@dgmcdona](https://github.com/dgmcdona)
### References
- http://fileformats.archiveteam.org/wiki/Property_List/Binary
- https://medium.com/@karaiskc/understanding-apples-binary-property-list-format-281e6da00dbd
- https://opensource.apple.com/source/CF/CF-550/CFBinaryPList.c
## bson
### Limitations
- The decimal128 type is not supported for decoding and will just be treated as binary.
### Convert represented value to JSON
```
$ fq -d bson torepr file.bson
```
### Filter represented value
```
$ fq -d bson 'torepr | select(.name=="bob")' file.bson
```
### Authors
- Mattias Wadman mattias.wadman@gmail.com, original author
- Matt Dale [@matthewdale](https://github.com/matthewdale), additional types and bug fixes
### References
- https://bsonspec.org/spec.html
## bytes
Decode to a sliceable and indexable binary of bytes.
### Slice out byte ranges
```sh
$ echo -n 'hello' | fq -d bytes '.[-3:]' > last_3_bytes
$ echo -n 'hello' | fq -d bytes '[.[-2:], .[0:2]] | tobytes' > first_last_2_bytes_swapped
```
### Slice and decode byte range
```sh
$ echo 'some {"a":1} json' | fq -d bytes '.[5:-6] | fromjson'
{
"a": 1
}
```
### Index bytes
```sh
$ echo 'hello' | fq -d bytes '.[1]'
101
```
## caff
### Options
|Name |Default|Description|
|- |- |-|
|`uncompress`|true |Uncompress and probe files|
### Examples
Decode file using caff options
```
$ fq -d caff -o uncompress=true . file
```
Decode value as caff
```
... | caff({uncompress:true})
```
### Authors
- [@ronsor](https://github.com/ronsor)
## cbor
### Convert represented value to JSON
```
$ fq -d cbor torepr file.cbor
```
### References
- https://en.wikipedia.org/wiki/CBOR
- https://www.rfc-editor.org/rfc/rfc8949.html
## csv
### Options
|Name |Default|Description|
|- |- |-|
|`comma` |, |Separator character|
|`comment`|# |Comment line character|
### Examples
Decode file using csv options
```
$ fq -d csv -o comma="," -o comment="#" . file
```
Decode value as csv
```
... | csv({comma:",",comment:"#"})
```
### TSV to CSV
```sh
$ fq -d csv -o comma="\t" to_csv file.tsv
```
### Convert rows to objects based on header row
```sh
$ fq -d csv '.[0] as $t | .[1:] | map(with_entries(.key = $t[.key]))' file.csv
```
## flac_frame
### Options
|Name |Default|Description|
|- |- |-|
|`bits_per_sample`|16 |Bits per sample|
### Examples
Decode file using flac_frame options
```
$ fq -d flac_frame -o bits_per_sample=16 . file
```
Decode value as flac_frame
```
... | flac_frame({bits_per_sample:16})
```
## hevc_au
### Options
|Name |Default|Description|
|- |- |-|
|`length_size`|4 |Length value size|
### Examples
Decode file using hevc_au options
```
$ fq -d hevc_au -o length_size=4 . file
```
Decode value as hevc_au
```
... | hevc_au({length_size:4})
```
## html
### Options
|Name |Default|Description|
|- |- |-|
|`array` |false |Decode as nested arrays|
|`attribute_prefix`|@ |Prefix for attribute keys|
|`seq` |false |Use seq attribute to preserve element order|
### Examples
Decode file using html options
```
$ fq -d html -o array=false -o attribute_prefix="@" -o seq=false . file
```
Decode value as html
```
... | html({array:false,attribute_prefix:"@",seq:false})
```
HTML is decoded in HTML5 mode and will always include `<html>`, `<head>` and `<body>` elements.
See the xml format for more examples, how to preserve element order and how to encode to XML.
There is no `to_html` function, see `to_xml` instead.
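For example, a decoded document can be re-serialized with `to_xml` (a sketch; the output is XML-style markup rather than guaranteed-valid HTML5):
```sh
$ echo '<a href="url">text</a>' | fq -r -d html 'to_xml({indent: 2})'
```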
### Element as object
```sh
# decode as object is the default
$ echo '<a href="url">text</a>' | fq -d html
{
"html": {
"body": {
"a": {
"#text": "text",
"@href": "url"
}
},
"head": ""
}
}
```
### Element as array
```sh
$ echo '<a href="url">text</a>' | fq -d html -o array=true
[
"html",
null,
[
[
"head",
null,
[]
],
[
"body",
null,
[
[
"a",
{
"#text": "text",
"href": "url"
},
[]
]
]
]
]
]
# decode html files to a {file: "title", ...} object
$ fq -n -d html '[inputs | {key: input_filename, value: .html.head.title?}] | from_entries' *.html
# href:s in file
$ fq -r -o array=true -d html '.. | select(.[0] == "a" and .[1].href)?.[1].href' file.html
```
## leveldb_descriptor
### Limitations
- fragmented non-"full" records are not merged and decoded further.
### Authors
- [@mikez](https://github.com/mikez), original author
### References
- https://github.com/google/leveldb/blob/main/doc/impl.md#manifest
- https://github.com/google/leveldb/blob/main/db/version_edit.cc
## leveldb_log
### Limitations
- individual record contents are not merged nor decoded further.
### Authors
- [@mikez](https://github.com/mikez), original author
### References
- https://github.com/google/leveldb/blob/main/doc/log_format.md
## leveldb_table
### Limitations
- no Meta Blocks (like "filter") are decoded yet.
- Zstandard uncompression is not implemented yet.
### Authors
- [@mikez](https://github.com/mikez), original author
### References
- https://github.com/google/leveldb/blob/main/doc/table_format.md
- https://github.com/google/leveldb/blob/main/doc/impl.md
- https://github.com/google/leveldb/blob/main/doc/index.md
## luajit
### Authors
- [@dlatchx](https://github.com/dlatchx)
### References
- https://github.com/LuaJIT/LuaJIT/blob/v2.1/src/lj_bcdump.h
- http://scm.zoomquiet.top/data/20131216145900/index.html
## macho
Supports decoding vanilla and FAT Mach-O binaries.
### Select 64bit load segments
```sh
$ fq '.load_commands[] | select(.cmd=="segment_64")' file
```
### References
- https://github.com/aidansteele/osx-abi-macho-file-format-reference
### Authors
- Sıddık AÇIL
acils@itu.edu.tr
[@Akaame](https://github.com/Akaame)
## markdown
### Array with all level 1 and 2 headers
```sh
$ fq -d markdown '[.. | select(.type=="heading" and .level<=2)?.children[0]]' file.md
```
## matroska
### Options
|Name |Default|Description|
|- |- |-|
|`decode_samples`|true |Decode samples|
### Examples
Decode file using matroska options
```
$ fq -d matroska -o decode_samples=true . file
```
Decode value as matroska
```
... | matroska({decode_samples:true})
```
### Lookup element using path
```sh
$ fq 'matroska_path(".Segment.Tracks[0]")' file.mkv
```
### Get path to element
```sh
$ fq 'grep_by(.id == "Tracks") | matroska_path' file.mkv
```
### References
- https://tools.ietf.org/html/draft-ietf-cellar-ebml-00
- https://matroska.org/technical/specs/index.html
- https://www.matroska.org/technical/basics.html
- https://www.matroska.org/technical/codec_specs.html
- https://wiki.xiph.org/MatroskaOpus
## moc3
### Authors
- [@ronsor](https://github.com/ronsor)
## mp3
### Options
|Name |Default|Description|
|- |- |-|
|`max_sync_seek` |32768 |Max byte distance to next sync|
|`max_unique_header_configs`|5 |Max number of unique frame header configs allowed|
|`max_unknown` |50 |Max percent (0-100) unknown bits|
### Examples
Decode file using mp3 options
```
$ fq -d mp3 -o max_sync_seek=32768 -o max_unique_header_configs=5 -o max_unknown=50 . file
```
Decode value as mp3
```
... | mp3({max_sync_seek:32768,max_unique_header_configs:5,max_unknown:50})
```
## mp4
### Options
|Name |Default|Description|
|- |- |-|
|`allow_truncated`|false |Allow box to be truncated|
|`decode_samples` |true |Decode samples|
### Examples
Decode file using mp4 options
```
$ fq -d mp4 -o allow_truncated=false -o decode_samples=true . file
```
Decode value as mp4
```
... | mp4({allow_truncated:false,decode_samples:true})
```
### Speed up decoding by not decoding samples
```sh
# manually decode the first sample as an aac_frame
$ fq -o decode_samples=false '.tracks[0].samples[0] | aac_frame | d' file.mp4
```
### Entries for first edit list as values
```sh
$ fq 'first(grep_by(.type=="elst").entries) | tovalue' file.mp4
```
### Whole box tree as JSON (exclude mdat data and tracks)
```sh
$ fq 'del(.tracks) | grep_by(.type=="mdat").data = "" | tovalue' file.mp4
```
### Force decode a single box
```sh
$ fq -n '"AAAAHGVsc3QAAAAAAAAAAQAAADIAAAQAAAEAAA==" | from_base64 | mp4({force:true}) | d'
```
### Lookup mp4 box using a mp4 box path.
```sh
# <decode value box> | mp4_path($path) -> <decode value box>
$ fq 'mp4_path(".moov.trak[1]")' file.mp4
```
### Get mp4 box path for a decode value box.
```sh
# <decode value box> | mp4_path -> string
$ fq 'grep_by(.type == "trak") | mp4_path' file.mp4
```
### References
- [ISO/IEC base media file format (MPEG-4 Part 12)](https://en.wikipedia.org/wiki/ISO/IEC_base_media_file_format)
- [Quicktime file format](https://developer.apple.com/standards/qtff-2001.pdf)
## msgpack
### Convert represented value to JSON
```
$ fq -d msgpack torepr file.msgpack
```
### References
- https://github.com/msgpack/msgpack/blob/master/spec.md
## opentimestamps
### View a full OpenTimestamps file
```
$ fq dd file.ots
```
### List the names of the Calendar servers used
```
$ fq '.operations | map(select(.attestation_type == "calendar") | .url)' file.ots
```
### Check if there are Bitcoin attestations present
```
$ fq '.operations | map(select(.attestation_type == "bitcoin")) | length > 0' file.ots
```
### Authors
- fiatjaf, https://fiatjaf.com
### References
- https://opentimestamps.org/
- https://github.com/opentimestamps/python-opentimestamps
## pcap
### Build object with number of (reassembled) TCP bytes sent to/from client IP
```sh
# for a pcapng file you would use .[0].tcp_connections for first section
$ fq '.tcp_connections | group_by(.client.ip) | map({key: .[0].client.ip, value: map(.client.stream, .server.stream | tobytes.size) | add}) | from_entries' file.pcap
{
"10.1.0.22": 15116,
"10.99.12.136": 234,
"10.99.12.150": 218
}
```
## pg_btree
### Options
|Name |Default|Description|
|- |- |-|
|`page`|0 |First page number in file, default is 0|
### Examples
Decode file using pg_btree options
```
$ fq -d pg_btree -o page=0 . file
```
Decode value as pg_btree
```
... | pg_btree({page:0})
```
### Btree index meta page
```sh
$ fq -d pg_btree -o flavour=postgres14 ".[0] | d" 16404
```
### Btree index page
```sh
$ fq -d pg_btree -o flavour=postgres14 ".[1]" 16404
```
### Authors
- Pavel Safonov
p.n.safonov@gmail.com
[@pnsafonov](https://github.com/pnsafonov)
### References
- https://www.postgresql.org/docs/current/storage-page-layout.html
## pg_control
### Options
|Name |Default|Description|
|- |- |-|
|`flavour`| |PostgreSQL flavour: postgres14, pgproee14.., postgres10|
### Examples
Decode file using pg_control options
```
$ fq -d pg_control -o flavour="" . file
```
Decode value as pg_control
```
... | pg_control({flavour:""})
```
### Decode content of pg_control file
```sh
$ fq -d pg_control -o flavour=postgres14 d pg_control
```
### Specific fields can be requested
```sh
$ fq -d pg_control -o flavour=postgres14 ".state, .check_point_copy.redo, .wal_level" pg_control
```
### Authors
- Pavel Safonov
p.n.safonov@gmail.com
[@pnsafonov](https://github.com/pnsafonov)
### References
- https://github.com/postgres/postgres/blob/REL_14_2/src/include/catalog/pg_control.h
## pg_heap
### Options
|Name |Default |Description|
|- |- |-|
|`flavour`|postgres14|PostgreSQL flavour: postgres14, pgproee14.., postgres10|
|`page` |0 |First page number in file, default is 0|
|`segment`|0 |Segment file number (16790.1 is 1), default is 0|
### Examples
Decode file using pg_heap options
```
$ fq -d pg_heap -o flavour="postgres14" -o page=0 -o segment=0 . file
```
Decode value as pg_heap
```
... | pg_heap({flavour:"postgres14",page:0,segment:0})
```
### To see a heap page's content
```sh
$ fq -d pg_heap -o flavour=postgres14 ".[0]" 16994
```
### To see a page's header
```sh
$ fq -d pg_heap -o flavour=postgres14 ".[0].page_header" 16994
```
### First and last item pointers on first page
```sh
$ fq -d pg_heap -o flavour=postgres14 ".[0].pd_linp[0, -1]" 16994
```
### First and last tuple on first page
```sh
$ fq -d pg_heap -o flavour=postgres14 ".[0].tuples[0, -1]" 16994
```
### Authors
- Pavel Safonov
p.n.safonov@gmail.com
[@pnsafonov](https://github.com/pnsafonov)
### References
- https://www.postgresql.org/docs/current/storage-page-layout.html
## protobuf
### Can decode sub messages
```sh
$ fq -d protobuf '.fields[6].wire_value | protobuf | d' file
```
### References
- https://developers.google.com/protocol-buffers/docs/encoding
## rtmp
Currently only supports plain RTMP (not RTMPT or encrypted variants etc.) with AMF0 (not AMF3).
### Show rtmp streams in PCAP file
```sh
fq '.tcp_connections[] | select(.server.port=="rtmp") | d' file.cap
```
### References
- https://rtmp.veriskope.com/docs/spec/
- https://rtmp.veriskope.com/pdf/video_file_format_spec_v10.pdf
## tls
### Options
|Name |Default|Description|
|- |- |-|
|`keylog`| |NSS Key Log content|
### Examples
Decode file using tls options
```
$ fq -d tls -o keylog="" . file
```
Decode value as tls
```
... | tls({keylog:""})
```
Supports decoding of most standard records, messages and extensions. It can also decrypt most standard cipher suites in a PCAP with traffic in both directions if an NSS key log is provided.
### Decode and decrypt providing a PCAP and key log
Write traffic to a PCAP file:
```sh
$ tcpdump -i <interface> -w traffic.pcap
```
Make sure your curl TLS backend supports `SSLKEYLOGFILE` and do:
```sh
$ SSLKEYLOGFILE=traffic.keylog curl --tls-max 1.2 https://host/path
```
Decode, decrypt and query. The `keylog=@<file>` form reads the option value from a keylog file:
```sh
# decode and show whole tree
$ fq -o keylog=@traffic.keylog d traffic.pcap
# write unencrypted server response to a file.
# first .stream is the TCP stream, second .stream is TLS application data stream
#
# first TCP connection:
$ fq -o keylog=@traffic.keylog '.tcp_connections[0].server.stream.stream | tobytes' traffic.pcap > data
# first TLS connection:
$ fq -o keylog=@traffic.keylog 'first(grep_by(.server.stream | format == "tls")).server.stream.stream | tobytes' traffic.pcap > data
```
### Supported cipher suites for decryption
`TLS_DH_ANON_EXPORT_WITH_DES40_CBC_SHA`,
`TLS_DH_ANON_EXPORT_WITH_RC4_40_MD5`,
`TLS_DHE_DSS_EXPORT_WITH_DES40_CBC_SHA`,
`TLS_DHE_DSS_WITH_3DES_EDE_CBC_SHA`,
`TLS_DHE_DSS_WITH_AES_128_CBC_SHA`,
`TLS_DHE_DSS_WITH_AES_128_CBC_SHA256`,
`TLS_DHE_DSS_WITH_AES_128_GCM_SHA256`,
`TLS_DHE_DSS_WITH_AES_256_CBC_SHA`,
`TLS_DHE_DSS_WITH_AES_256_CBC_SHA256`,
`TLS_DHE_DSS_WITH_AES_256_GCM_SHA384`,
`TLS_DHE_DSS_WITH_DES_CBC_SHA`,
`TLS_DHE_DSS_WITH_RC4_128_SHA`,
`TLS_DHE_RSA_EXPORT_WITH_DES40_CBC_SHA`,
`TLS_DHE_RSA_WITH_3DES_EDE_CBC_SHA`,
`TLS_DHE_RSA_WITH_AES_128_CBC_SHA`,
`TLS_DHE_RSA_WITH_AES_128_CBC_SHA256`,
`TLS_DHE_RSA_WITH_AES_128_GCM_SHA256`,
`TLS_DHE_RSA_WITH_AES_256_CBC_SHA`,
`TLS_DHE_RSA_WITH_AES_256_CBC_SHA256`,
`TLS_DHE_RSA_WITH_AES_256_GCM_SHA384`,
`TLS_DHE_RSA_WITH_CHACHA20_POLY1305_SHA256`,
`TLS_DHE_RSA_WITH_DES_CBC_SHA`,
`TLS_ECDH_ECDSA_WITH_3DES_EDE_CBC_SHA`,
`TLS_ECDH_ECDSA_WITH_AES_128_CBC_SHA`,
`TLS_ECDH_ECDSA_WITH_AES_128_CBC_SHA256`,
`TLS_ECDH_ECDSA_WITH_AES_128_GCM_SHA256`,
`TLS_ECDH_ECDSA_WITH_AES_256_CBC_SHA`,
`TLS_ECDH_ECDSA_WITH_AES_256_CBC_SHA384`,
`TLS_ECDH_ECDSA_WITH_AES_256_GCM_SHA384`,
`TLS_ECDH_ECDSA_WITH_RC4_128_SHA`,
`TLS_ECDH_RSA_WITH_3DES_EDE_CBC_SHA`,
`TLS_ECDH_RSA_WITH_AES_128_CBC_SHA`,
`TLS_ECDH_RSA_WITH_AES_128_CBC_SHA256`,
`TLS_ECDH_RSA_WITH_AES_128_GCM_SHA256`,
`TLS_ECDH_RSA_WITH_AES_256_CBC_SHA`,
`TLS_ECDH_RSA_WITH_AES_256_CBC_SHA384`,
`TLS_ECDH_RSA_WITH_AES_256_GCM_SHA384`,
`TLS_ECDH_RSA_WITH_RC4_128_SHA`,
`TLS_ECDHE_ECDSA_WITH_3DES_EDE_CBC_SHA`,
`TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA`,
`TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256`,
`TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256`,
`TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA`,
`TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384`,
`TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384`,
`TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256`,
`TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305`,
`TLS_ECDHE_ECDSA_WITH_RC4_128_SHA`,
`TLS_ECDHE_PSK_WITH_AES_128_CBC_SHA`,
`TLS_ECDHE_PSK_WITH_AES_128_GCM_SHA256`,
`TLS_ECDHE_PSK_WITH_AES_256_CBC_SHA`,
`TLS_ECDHE_RSA_WITH_3DES_EDE_CBC_SHA`,
`TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA`,
`TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256`,
`TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256`,
`TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA`,
`TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384`,
`TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384`,
`TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256`,
`TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305`,
`TLS_ECDHE_RSA_WITH_RC4_128_SHA`,
`TLS_PSK_WITH_AES_128_CBC_SHA`,
`TLS_PSK_WITH_AES_256_CBC_SHA`,
`TLS_PSK_WITH_RC4_128_SHA`,
`TLS_RSA_EXPORT_WITH_DES40_CBC_SHA`,
`TLS_RSA_EXPORT_WITH_RC4_40_MD5`,
`TLS_RSA_WITH_3DES_EDE_CBC_SHA`,
`TLS_RSA_WITH_AES_128_CBC_SHA`,
`TLS_RSA_WITH_AES_128_CBC_SHA256`,
`TLS_RSA_WITH_AES_128_GCM_SHA256`,
`TLS_RSA_WITH_AES_256_CBC_SHA`,
`TLS_RSA_WITH_AES_256_CBC_SHA256`,
`TLS_RSA_WITH_AES_256_GCM_SHA384`,
`TLS_RSA_WITH_DES_CBC_SHA`,
`TLS_RSA_WITH_RC4_128_MD5`,
`TLS_RSA_WITH_RC4_128_SHA`
### References
- [RFC 5246: The Transport Layer Security (TLS) Protocol](https://www.rfc-editor.org/rfc/rfc5246)
- [RFC 6101: The Secure Sockets Layer (SSL) Protocol Version 3.0](https://www.rfc-editor.org/rfc/rfc6101)
## tzif
### Get last transition time
```sh
fq '.v2plusdatablock.transition_times[-1] | tovalue' tziffile
```
### Count leap second records
```sh
fq '.v2plusdatablock.leap_second_records | length' tziffile
```
### Authors
- Takashi Oguma
[@bitbears-dev](https://github.com/bitbears-dev)
[@0xb17bea125](https://twitter.com/0xb17bea125)
### References
- https://datatracker.ietf.org/doc/html/rfc8536
## wasm
### Count opcode usage
```sh
$ fq '.sections[] | select(.id == "code_section") | [.. | .opcode? // empty] | count | map({key: .[0], value: .[1]}) | from_entries' file.wasm
```
### List exports and imports
```sh
$ fq '.sections | {import: map(select(.id == "import_section").content.im.x[].nm.b), export: map(select(.id == "export_section").content.ex.x[].nm.b)}' file.wasm
```
### Authors
- Takashi Oguma
[@bitbears-dev](https://github.com/bitbears-dev)
[@0xb17bea125](https://twitter.com/0xb17bea125)
### References
- https://webassembly.github.io/spec/core/
## xml
### Options
|Name |Default|Description|
|- |- |-|
|`array` |false |Decode as nested arrays|
|`attribute_prefix`|@ |Prefix for attribute keys|
|`seq` |false |Use seq attribute to preserve element order|
### Examples
Decode file using xml options
```
$ fq -d xml -o array=false -o attribute_prefix="@" -o seq=false . file
```
Decode value as xml
```
... | xml({array:false,attribute_prefix:"@",seq:false})
```
XML can be decoded and encoded into jq values in two ways: elements as objects or as arrays.
Which variant to use depends a bit on what you want to do. The object variant might be easier
to query for a specific value, while the array variant might be easier to use when generating XML
or when querying for all elements of some kind.
Encoding is done using the `to_xml` function, which figures out which variant is used based on the input value.
It has two optional options, `indent` and `attribute_prefix`.
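For example (a sketch), a round-trip that sets both options explicitly:
```sh
$ echo '<a attr="value">text</a>' | fq -r -d xml 'to_xml({indent: 2, attribute_prefix: "@"})'
```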
### Elements as object
An element can have different shapes depending on body text, attributes and children:
- `<a key="value">text</a>` is `{"a":{"#text":"text","@key":"value"}}`, has text (`#text`) and attributes (`@key`)
- `<a>text</a>` is `{"a":"text"}`
- `<a><b>text</b></a>` is `{"a":{"b":"text"}}`, one child with only text and no attributes
- `<a><b/><b>text</b></a>` is `{"a":{"b":["","text"]}}`, two children with the same name end up in an array
- `<a><b/><b key="value">text</b></a>` is `{"a":{"b":["",{"#text":"text","@key":"value"}]}}`
If there is a `#seq` attribute it encodes the child element order. Use `-o seq=true` to include sequence numbers when decoding,
otherwise order might be lost.
```sh
# decode as object is the default
$ echo '<a><b/><b>bbb</b><c attr="value">ccc</c></a>' | fq -d xml -o seq=true
{
"a": {
"b": [
{
"#seq": 0
},
{
"#seq": 1,
"#text": "bbb"
}
],
"c": {
"#seq": 2,
"#text": "ccc",
"@attr": "value"
}
}
}
# access text of the <c> element
$ echo '<a><b/><b>bbb</b><c attr="value">ccc</c></a>' | fq '.a.c["#text"]'
"ccc"
# decode to object and encode to xml
$ echo '<a><b/><b>bbb</b><c attr="value">ccc</c></a>' | fq -r -d xml -o seq=true 'to_xml({indent:2})'
<a>
  <b></b>
  <b>bbb</b>
  <c attr="value">ccc</c>
</a>
```
### Elements as array
Elements are arrays of the shape `["name", {"#text": "body text", "attr_name": "attr value", ...} | null, [<child element>, ...]]`.
```sh
# decode as array
$ echo '<a><b/><b>bbb</b><c attr="value">ccc</c></a>' | fq -d xml -o array=true
[
"a",
null,
[
[
"b",
null,
[]
],
[
"b",
{
"#text": "bbb"
},
[]
],
[
"c",
{
"#text": "ccc",
"attr": "value"
},
[]
]
]
]
# decode to array and encode to xml
$ echo '<a><b/><b>bbb</b><c attr="value">ccc</c></a>' | fq -r -d xml -o array=true -o seq=true 'to_xml({indent:2})'
<a>
  <b></b>
  <b>bbb</b>
  <c attr="value">ccc</c>
</a>
# access text of the <c> element, the object variant above is probably easier to use
$ echo '<a><b/><b>bbb</b><c attr="value">ccc</c></a>' | fq -o array=true '.[2][2][1]["#text"]'
"ccc"
```
### References
- [xml.com's Converting Between XML and JSON](https://www.xml.com/pub/a/2006/05/31/converting-between-xml-and-json.html)
## zip
### Options
|Name |Default|Description|
|- |- |-|
|`uncompress`|true |Uncompress and probe files|
### Examples
Decode file using zip options
```
$ fq -d zip -o uncompress=true . file
```
Decode value as zip
```
... | zip({uncompress:true})
```
Supports ZIP64.
### Timestamp and time zones
The timestamp accessed via `.local_files[].last_modification` is encoded in ZIP files using the [MS-DOS representation](https://learn.microsoft.com/en-us/windows/win32/api/oleauto/nf-oleauto-dosdatetimetovarianttime), which has no time zone information; most likely the local date and time at creation were used. The `unix_guess` field in `last_modification` is a guess that assumes the local time zone was UTC at creation.
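To inspect these fields for the first local file, something like this sketch can be used:
```sh
$ fq '.local_files[0].last_modification | d' file.zip
```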
### References
- https://pkware.cachefly.net/webdocs/casestudies/APPNOTE.TXT
- https://opensource.apple.com/source/zip/zip-6/unzip/unzip/proginfo/extra.fld
- https://formats.kaitai.io/dos_datetime/
- https://learn.microsoft.com/en-us/windows/win32/api/oleauto/nf-oleauto-dosdatetimetovarianttime
[#]: sh-end
## Dependency graph
![alt text](formats.svg "Format diagram")