.TH "dcm2json" 1 "Wed Jan 14 2026 06:47:06" "Version 3.7.0" "OFFIS DCMTK" \" -*- nroff -*- .nh .SH NAME dcm2json \- Convert DICOM file and data set to JSON .PP .SH "SYNOPSIS" .PP .PP .nf dcm2json [options] dcmfile-in [jsonfile-out] .fi .PP .SH "DESCRIPTION" .PP The \fBdcm2json\fP utility converts the contents of a DICOM file (file format or raw data set) to JSON (JavaScript Object Notation)\&. The output refers to the "DICOM JSON Model", which is found in DICOM Part 18 Section F\&. .PP If \fBdcm2json\fP reads a raw data set (DICOM data without a file format meta-header), it will attempt to guess the transfer syntax by examining the first few bytes of the file\&. It is not always possible to correctly guess the transfer syntax and it is better to convert a data set to a file format whenever possible (using the \fBdcmconv\fP utility)\&. It is also possible to use the \fI-f\fP and \fI-t[ieb]\fP options to force \fBdcm2json\fP to read a data set with a particular transfer syntax\&. .SH "PARAMETERS" .PP .PP .nf dcmfile-in DICOM input filename to be converted ("-" for stdin) jsonfile-out JSON output filename (default: stdout) .fi .PP .SH "OPTIONS" .PP .SS "general options" .PP .nf -h --help print this help text and exit --version print version information and exit --arguments print expanded command line arguments -q --quiet quiet mode, print no warnings and errors -v --verbose verbose mode, print processing details -d --debug debug mode, print debug information -ll --log-level [l]evel: string constant (fatal, error, warn, info, debug, trace) use level l for the logger -lc --log-config [f]ilename: string use config file f for the logger .fi .PP .SS "input options" .PP .nf input file format: +f --read-file read file format or data set (default) +fo --read-file-only read file format only -f --read-dataset read data set without file meta information input transfer syntax: -t= --read-xfer-auto use TS recognition (default) -td --read-xfer-detect ignore TS specified in the file meta header -te --read-xfer-little read with explicit VR little endian TS -tb --read-xfer-big read with explicit VR big endian TS -ti --read-xfer-implicit read with implicit VR little endian TS .fi .PP .SS "processing options" .PP .nf encoding of infinity and not-a-number: -es --encode-strict report error for 'inf' and 'nan' (default) -ee --encode-extended permit 'inf' and 'nan' in JSON numbers encoding of IS and DS (integer/decimal string) elements: -ia --is-ds-auto encode as number if valid, as string otherwise (default) -in --is-ds-num always encode as number, fail if invalid -is --is-ds-string always encode as string bulk data URI options: -b --bulk-disabled write everything as inline binary (default) +b --bulk-enabled write large attributes as bulk data +bz --bulk-size [s]ize: integer (default: 1) use bulk data for attributes >= s kBytes +bp --bulk-uri-prefix [u]ri prefix: string use prefix u when generating bulk data URIs (default: file URI) +bd --bulk-dir [d]irectory: string write bulk data files to d (default: '\&.') +bs --bulk-subdir create subdirectory for each SOP instance (default: no subdirectory) .fi .PP .SS "output options" .PP .nf output format: +fc --formatted-code enable whitespace formatting (default) # prints additional spaces and newlines for increased # readability -fc --compact-code print only required characters +m --write-meta write data set with meta information (warning: not conforming to the DICOM standard) .fi .PP .SH "JSON Format" .PP The basic structure of the JSON output created from a DICOM file 
.SH "PARAMETERS"
.PP
.PP
.nf
dcmfile-in    DICOM input filename to be converted ("-" for stdin)

jsonfile-out  JSON output filename (default: stdout)
.fi
.PP
.SH "OPTIONS"
.PP
.SS "general options"
.PP
.nf
  -h    --help
          print this help text and exit

        --version
          print version information and exit

        --arguments
          print expanded command line arguments

  -q    --quiet
          quiet mode, print no warnings and errors

  -v    --verbose
          verbose mode, print processing details

  -d    --debug
          debug mode, print debug information

  -ll   --log-level  [l]evel: string constant
          (fatal, error, warn, info, debug, trace)
          use level l for the logger

  -lc   --log-config  [f]ilename: string
          use config file f for the logger
.fi
.PP
.SS "input options"
.PP
.nf
input file format:

  +f    --read-file
          read file format or data set (default)

  +fo   --read-file-only
          read file format only

  -f    --read-dataset
          read data set without file meta information

input transfer syntax:

  -t=   --read-xfer-auto
          use TS recognition (default)

  -td   --read-xfer-detect
          ignore TS specified in the file meta header

  -te   --read-xfer-little
          read with explicit VR little endian TS

  -tb   --read-xfer-big
          read with explicit VR big endian TS

  -ti   --read-xfer-implicit
          read with implicit VR little endian TS
.fi
.PP
.SS "processing options"
.PP
.nf
encoding of infinity and not-a-number:

  -es   --encode-strict
          report error for 'inf' and 'nan' (default)

  -ee   --encode-extended
          permit 'inf' and 'nan' in JSON numbers

encoding of IS and DS (integer/decimal string) elements:

  -ia   --is-ds-auto
          encode as number if valid, as string otherwise (default)

  -in   --is-ds-num
          always encode as number, fail if invalid

  -is   --is-ds-string
          always encode as string

bulk data URI options:

  -b    --bulk-disabled
          write everything as inline binary (default)

  +b    --bulk-enabled
          write large attributes as bulk data

  +bz   --bulk-size  [s]ize: integer (default: 1)
          use bulk data for attributes >= s kBytes

  +bp   --bulk-uri-prefix  [u]ri prefix: string
          use prefix u when generating bulk data URIs
          (default: file URI)

  +bd   --bulk-dir  [d]irectory: string
          write bulk data files to d (default: '\&.')

  +bs   --bulk-subdir
          create subdirectory for each SOP instance
          (default: no subdirectory)
.fi
.PP
.SS "output options"
.PP
.nf
output format:

  +fc   --formatted-code
          enable whitespace formatting (default)

          # prints additional spaces and newlines for increased
          # readability

  -fc   --compact-code
          print only required characters

  +m    --write-meta
          write data set with meta information
          (warning: not conforming to the DICOM standard)
.fi
.PP
.SH "JSON Format"
.PP
The basic structure of the JSON output created from a DICOM file looks like the following (see DICOM Part 18 Section F for details):
.PP
.PP
.nf
{
  "00080005": { "vr": "CS", "Value": [ "ISO_IR 192" ] },
  "00080020": { "vr": "DT", "Value": [ "20130409" ] },
  "00080030": { "vr": "TM", "Value": [ "131600\&.0000" ] },
  "00080050": { "vr": "SH", "Value": [ "11235813" ] },
  "00080056": { "vr": "CS", "Value": [ "ONLINE" ] },
  "00080061": { "vr": "CS", "Value": [ "CT", "PET" ] },
  "00080090": { "vr": "PN", "Value": [ { "Alphabetic": "^Bob^^Dr\&." } ] },
  "00081190": { "vr": "UR", "Value": [ "http://wado\&.nema\&.org/studies/
      1\&.2\&.392\&.200036\&.9116\&.2\&.2\&.2\&.1762893313\&.1029997326\&.945873" ] },
  "00090010": { "vr": "LO", "Value": [ "Vendor A" ] },
  "00091002": { "vr": "UN", "InlineBinary": "z0x9c8v7" },
  "00100010": { "vr": "PN", "Value": [ { "Alphabetic": "Wang^XiaoDong" } ] },
  "00100020": { "vr": "LO", "Value": [ "12345" ] },
  "00100021": { "vr": "LO", "Value": [ "Hospital A" ] },
  "00100030": { "vr": "DA", "Value": [ "19670701" ] },
  "00100040": { "vr": "CS", "Value": [ "M" ] },
  "00101002": { "vr": "SQ", "Value": [
    {
      "00100020": { "vr": "LO", "Value": [ "54321" ] },
      "00100021": { "vr": "LO", "Value": [ "Hospital B" ] }
    },
    {
      "00100020": { "vr": "LO", "Value": [ "24680" ] },
      "00100021": { "vr": "LO", "Value": [ "Hospital C" ] }
    }
  ] },
  "0020000D": { "vr": "UI", "Value": [ "1\&.2\&.392\&.200036\&.9116\&.2\&.2\&.2\&.1762893313\&.1029997326\&.945873" ] },
  "00200010": { "vr": "SH", "Value": [ "11235813" ] },
  "00201206": { "vr": "IS", "Value": [ 4 ] },
  "00201208": { "vr": "IS", "Value": [ 942 ] }
}
.fi
.PP
.SS "Bulk Data"
By default, binary data, i\&.e\&. DICOM element values with a Value Representation (VR) of OB, OD, OF, OL, OV, OW or UN, are written as "InlineBinary" (base64 encoding) to the JSON output\&. Option \fI--bulk-enabled\fP causes binary data as well as DS, FD, FL, IS, SV and UV values to be replaced by "BulkDataURI" values if the element value is larger than the cut-off threshold (default: 1 kByte)\&. The cut-off threshold can be specified with the \fI--bulk-size\fP option\&. The element values themselves are written as files to the directory given by the \fI--bulk-dir\fP option (default: current directory)\&. The filename is based on a SHA-256 checksum of the element value\&. By default, file URIs are generated that point to the bulk directory\&. For production use, a URI prefix for a WADO-RS service over which the element values can be retrieved should be specified using the \fI--bulk-uri-prefix\fP option\&. This can be implemented by configuring a web server that has read access to \fBdcm2json's\fP bulk directory\&. Finally, the option \fI--bulk-subdir\fP will cause a separate subdirectory to be created (and used in the bulk data URI) for each distinct SOP instance\&.
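.PP
As a sketch of these bulk data options (the bulk directory and the WADO-RS URI prefix below are placeholders, not built-in defaults), attributes of 100 kBytes or more could be written as bulk data files and referenced via a web service:
.PP
.nf
# directory and URI prefix are placeholders
dcm2json +b +bz 100 +bd /data/bulk +bs \e
    +bp https://pacs.example.com/wado-rs/ input.dcm output.json
.fi
.PP
With such a call, the JSON output would contain "BulkDataURI" entries below the given prefix instead of "InlineBinary" values for the affected attributes\&.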
.PP
Note that the JSON syntax for the representation of encapsulated pixel data in "InlineBinary" form is unspecified in DICOM, as is the JSON representation of encapsulated multi-frame pixel data\&. Such DICOM files cannot be converted to JSON using \fBdcm2json\fP\&.
.PP
The file name extension of the generated bulk data files can be used to determine the MIME type that should be returned by the WADO-RS server:
.PP
.PP
.nf
Extension    MIME Type
\&.bin         application/octet-stream
\&.jpeg        image/jpeg
\&.dicom-rle   image/dicom-rle
\&.jls         image/jls
\&.jp2         image/jp2
\&.jpx         image/jpx
\&.jphc        image/jphc
\&.jxl         image/jxl
\&.mpeg        video/mpeg
\&.mp4         video/mp4
\&.H265        video/H265
.fi
.PP
.PP
Finally, it should be noted that bulk data will be written "as is", i\&.e\&. \fBdcm2json\fP will not attempt to validate or modify the element value in any way\&. For example, \fBdcm2json\fP will not add a JFIF header in the case of JPEG baseline compressed images, which some JPEG viewers may expect\&.
.SH "NOTES"
.PP
.SS "Numbers as Strings"
The DICOM standard allows certain numeric DICOM value representations, namely DS, IS, SV and UV, to be converted either to a JSON number or to a JSON string\&. \fBdcm2json\fP converts DS and IS values to JSON numbers if they are valid decimal strings or integer strings, and to JSON strings if they contain any illegal character\&. \fBdcm2json\fP converts SV and UV values to JSON numbers if they are not larger than 9007199254740991 and not smaller than -9007199254740991, and to JSON strings otherwise\&. While the JSON specification permits larger numbers, these are the largest integers that JavaScript can represent without loss of precision\&. Therefore, many JSON parsers cannot process larger numbers\&.
.SS "Character Encoding"
As required by the DICOM JSON encoding, \fBdcm2json\fP always creates output in Unicode UTF-8 encoding and converts DICOM data sets accordingly\&. If this is not possible, for example because DCMTK has been compiled without character set conversion support, an error is returned\&.
.SH "LOGGING"
.PP
The level of logging output of the various command line tools and underlying libraries can be specified by the user\&. By default, only errors and warnings are written to the standard error stream\&. Using option \fI--verbose\fP, informational messages such as processing details are also reported\&. Option \fI--debug\fP can be used to get more details on the internal activity, e\&.g\&. for debugging purposes\&. Other logging levels can be selected using option \fI--log-level\fP\&. In \fI--quiet\fP mode, only fatal errors are reported\&. In such very severe error events, the application will usually terminate\&. For more details on the different logging levels, see the documentation of module "oflog"\&.
.PP
In case the logging output should be written to file (optionally with logfile rotation), to syslog (Unix) or the event log (Windows), option \fI--log-config\fP can be used\&. This configuration file also allows for directing only certain messages to a particular output stream and for filtering certain messages based on the module or application where they are generated\&. An example configuration file is provided in \fI/logger\&.cfg\fP\&.
.SH "COMMAND LINE"
.PP
All command line tools use the following notation for parameters: square brackets enclose optional values (0-1), three trailing dots indicate that multiple values are allowed (1-n), a combination of both means 0 to n values\&.
.PP
Command line options are distinguished from parameters by a leading '+' or '-' sign, respectively\&. Usually, order and position of command line options are arbitrary (i\&.e\&. they can appear anywhere)\&. However, if options are mutually exclusive, the rightmost appearance is used\&. This behavior conforms to the standard evaluation rules of common Unix shells\&.
.PP
In addition, one or more command files can be specified using an '@' sign as a prefix to the filename (e\&.g\&. \fI@command\&.txt\fP)\&. Such a command argument is replaced by the content of the corresponding text file (multiple whitespaces are treated as a single separator unless they appear between two quotation marks) prior to any further evaluation\&. Please note that a command file cannot contain another command file\&. This simple but effective approach allows one to summarize common combinations of options/parameters and avoids longish and confusing command lines (an example is provided in file \fI/dumppat\&.txt\fP)\&.
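.PP
For instance (the file name and its contents are chosen here only for illustration), if a text file \fIoptions\&.txt\fP contains the single line '+fo --compact-code', the following two calls would be equivalent:
.PP
.nf
# assuming options.txt contains: +fo --compact-code
dcm2json @options.txt input.dcm output.json
dcm2json +fo --compact-code input.dcm output.json
.fi
.PP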
.SH "EXIT CODES"
.PP
The \fBdcm2json\fP utility uses the following exit codes when terminating\&. This enables the user to check for the reason why the application terminated\&.
.SS "general"
.PP
.nf
EXITCODE_NO_ERROR                     0
EXITCODE_COMMANDLINE_SYNTAX_ERROR     1
.fi
.PP
.SS "input file errors"
.PP
.nf
EXITCODE_CANNOT_READ_INPUT_FILE      20
EXITCODE_NO_INPUT_FILES              21
.fi
.PP
.SS "output file errors"
.PP
.nf
EXITCODE_CANNOT_WRITE_OUTPUT_FILE    40
.fi
.PP
.SS "processing errors"
.PP
.nf
EXITCODE_CANNOT_CONVERT_TO_UNICODE   80
EXITCODE_CANNOT_WRITE_VALID_JSON     81
.fi
.PP
.SH "ENVIRONMENT"
.PP
The \fBdcm2json\fP utility will attempt to load DICOM data dictionaries specified in the \fIDCMDICTPATH\fP environment variable\&. By default, i\&.e\&. if the \fIDCMDICTPATH\fP environment variable is not set, the file \fI/dicom\&.dic\fP will be loaded unless the dictionary is built into the application (default for Windows)\&.
.PP
The default behavior should be preferred and the \fIDCMDICTPATH\fP environment variable should only be used when alternative data dictionaries are required\&. The \fIDCMDICTPATH\fP environment variable has the same format as the Unix shell \fIPATH\fP variable in that a colon (":") separates entries\&. On Windows systems, a semicolon (";") is used as a separator\&. The data dictionary code will attempt to load each file specified in the \fIDCMDICTPATH\fP environment variable\&. It is an error if no data dictionary can be loaded\&.
.PP
The \fBdcm2json\fP utility will also attempt to load character set mapping tables\&. This happens when DCMTK was compiled with the oficonv library (which is the default) and the mapping tables are not built into the library (default when DCMTK uses shared libraries)\&.
.PP
The mapping table files are expected in DCMTK's \fI\fP\&. The \fIDCMICONVPATH\fP environment variable can be used to specify a different location\&. If a different location is specified, those mapping tables also replace any built-in tables\&.
.SH "COPYRIGHT"
.PP
Copyright (C) 2016-2025 by OFFIS e\&.V\&., Escherweg 2, 26121 Oldenburg, Germany\&.