⚡High Performance DICOM Medical Image Parser in Go.




👀 v1.0 just released!

This is a library and command-line tool to read, write, and generally work with DICOM medical image files in native Go. The goal is to build a full-featured, high-performance, and readable DICOM parser for the Go community.

After a fair bit of work, I've just released v1.0 of this library, which has been essentially rewritten from the ground up to be more canonical Go, with better tests, new features, many bugfixes, and more (though there is always more to come on the roadmap).

Some notable features:

  • Parse multi-frame DICOM imagery (both encapsulated and native pixel data)
  • Channel-based streaming of Frames to a client as they are parsed out of the DICOM
  • Cleaner Go Element and Dataset representations (in the absence of Go generics)
  • Better support for icon image sets in addition to primary image sets
  • Write and encode Datasets back to DICOM files
  • Enhanced testing and benchmarking support
  • Modern, canonical Go
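The channel-based streaming model above can be sketched in plain Go. This is a generic illustration of the pattern only; the frame type and parseFrames function below are stand-ins, not this package's actual API.

```go
package main

import "fmt"

// frame is a stand-in for a decoded image frame.
type frame struct{ index int }

// parseFrames simulates a parser that sends each frame to the caller over a
// channel as soon as it is decoded, instead of waiting for the whole file.
func parseFrames(n int) <-chan frame {
	ch := make(chan frame)
	go func() {
		defer close(ch)
		for i := 0; i < n; i++ {
			ch <- frame{index: i} // in a real parser: decoded pixel data
		}
	}()
	return ch
}

func main() {
	// The consumer can begin work on frame 0 while later frames still parse.
	for f := range parseFrames(3) {
		fmt.Println("got frame", f.index)
	}
}
```

The payoff is pipelining: for large multi-frame files, downstream work (rendering, encoding to PNG) can overlap with parsing.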


To use this in your Go project, import github.com/suyashkumar/dicom. This repository supports Go modules and regularly tags releases using semantic versioning. Typical usage is straightforward:

dataset, _ := dicom.ParseFile("testdata/1.dcm", nil) // See also: dicom.Parse which has a generic io.Reader API.

// The Dataset will print the DICOM data nicely out of the box.

// Dataset is also JSON serializable out of the box.
j, _ := json.Marshal(dataset)

More details about the package (and additional examples and APIs) can be found in the godoc.

CLI Tool

A CLI tool that uses this package to parse imagery and metadata out of DICOMs is provided in the cmd/dicomutil package. This tool can take in a DICOM, and dump out all the elements to STDOUT, in addition to writing out any imagery to the current working directory as PNGs or JPEGs (note: it does not perform any automatic color rescaling by default).


You can download the prebuilt binaries from the releases tab, or use the following to download the binary at the command line using my getbin tool:

wget -qO- "https://getbin.io/suyashkumar/dicom" | tar xvz

(This attempts to infer your OS and 301-redirects wget to the latest GitHub release asset for your system. Downloads come from GitHub releases.)


dicomutil -path myfile.dcm

Note: for some DICOMs (with native pixel data) no automatic intensity scaling is applied yet (this is coming). You can apply this in your image viewer if needed (in Preview on macOS, go to Tools -> Adjust Color).
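For context, the missing intensity scaling is a linear window/level mapping. Below is a simplified sketch of DICOM-style WindowCenter/WindowWidth rescaling; windowLevel is an illustrative helper, not part of this library, and the standard's exact VOI LUT formula differs slightly at the window edges.

```go
package main

import "fmt"

// windowLevel maps a raw pixel intensity onto 0-255 using DICOM-style
// WindowCenter/WindowWidth values. Values below the window clamp to black,
// values above clamp to white, and in-window values scale linearly.
func windowLevel(raw, center, width float64) uint8 {
	low := center - width/2
	switch {
	case raw <= low:
		return 0
	case raw >= low+width:
		return 255
	default:
		return uint8((raw - low) / width * 255)
	}
}

func main() {
	// Example values: WindowCenter 550 and WindowWidth 1024.
	fmt.Println(windowLevel(550, 550, 1024)) // mid-gray
	fmt.Println(windowLevel(0, 550, 1024))   // clamped to black
}
```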

Build manually

To build manually, ensure you have make and go installed. Clone (or go get) this repo into your $GOPATH and then simply run:

make

which will build the dicomutil binary and place it in a build/ folder in your current working directory.

You can also build it using Go directly:

go build -o dicomutil ./cmd/dicomutil


Here's a little more history on this repository for those who are interested!


The v0 suyashkumar/dicom started off as a hard fork of go-dicom, which was no longer being actively maintained (the original author was supportive of my fork; thank you!). I worked on adding several new capabilities, bug fixes, and general maintainability refactors (like multiframe support, streaming parsing, updated APIs, low-level parsing bug fixes, and more).

That represents the v0 history of the repository.


For v1 I rewrote and redesigned the core library essentially from scratch, and added several new features and bug fixes that only live in v1. The architecture and APIs are completely different, as is some of the underlying parser logic (to be more efficient and correct). Most of the core rewrite work happened on the s/1.0-rewrite branch.


  • Feature/person name parser


    Adds helper functions and data structures for parsing / writing DICOM PN (Person Name) values as defined in the DICOM standard.

    New functions and types are part of the newly created pkg/personName package.

    I am more than happy to make any and all API changes you feel would be better in line with the greater package.

    Example parsing PN value:

    // A raw PN string value with Alphabetic, Ideographic, and Phonetic information.
    rawPN := "Potter^Harry^James^^=哈利^波特^詹姆^^=hɛər.i^pɒ.tər^dʒeɪmz^^"
    // To parse a PN value, use personName.Parse.
    parsedPN, err := personName.Parse(rawPN)
    if err != nil {
        panic(err)
    }
    // PN values are broken into three groups (Alphabetic, Ideographic, and
    // Phonetic), which can be accessed through struct fields.
    fmt.Println("ALPHABETIC:", parsedPN.Alphabetic.String())
    fmt.Println("IDEOGRAPHIC:", parsedPN.Ideographic.String())
    fmt.Println("PHONETIC:", parsedPN.Phonetic.String())
    // Each group's info in turn has a struct field for retrieving individual segment data.
    fmt.Println("FAMILY NAME:", parsedPN.Alphabetic.FamilyName)
    fmt.Println("GIVEN NAME: ", parsedPN.Alphabetic.GivenName)
    fmt.Println("MIDDLE NAME:", parsedPN.Alphabetic.MiddleName)
    fmt.Println("NAME PREFIX:", parsedPN.Alphabetic.NamePrefix)
    fmt.Println("NAME SUFFIX:", parsedPN.Alphabetic.NameSuffix)
    // To print the original raw value, simply use the String method.
    fmt.Println("ORIGINAL RAW:", parsedPN.String())


    ALPHABETIC: Potter^Harry^James^^
    IDEOGRAPHIC: 哈利^波特^詹姆^^
    PHONETIC: hɛər.i^pɒ.tər^dʒeɪmz^^
    FAMILY NAME: Potter
    GIVEN NAME:  Harry
    MIDDLE NAME: James
    ORIGINAL RAW: Potter^Harry^James^^=哈利^波特^詹姆^^=hɛər.i^pɒ.tər^dʒeɪmz^^

    Example creating a PN value:

    // Create a new PN value like so. (The group arguments have been elided
    // from this excerpt.)
    pnVal := personName.New(
        // These PN groups will render trailing separators, and groups that add
        // no information are removed from the string render.
        ...
    )
    // Print the string; it should render as 'Potter^Harry^James^^'. Even though
    // the empty groups were set to render trailing separators, they are
    // suppressed because they are empty.
    fmt.Println("PN 1:", pnVal.String())
    // Now let's make one that still renders empty groups with String().
    // (Arguments elided again; this time groups that add no information are
    // NOT removed from the string render.)
    pnVal = personName.New(
        ...
    )
    // This will render as 'Potter^Harry^James^^=^^^^=^^^^'.
    fmt.Println("PN 2:", pnVal.String())


    PN 1: Potter^Harry^James^^
    PN 2: Potter^Harry^James^^=^^^^=^^^^

    personName.NewGroupEmpty() is a QoL method which yields a personName.GroupInfo object with all segments set to a null string.

    opened by bpeake-illuscio 24
  • Force flag to override "Keyword 'DICM' not found in the header" (issue #59)

    @suyashkumar I realized while I was writing this PR that the header (func parseFileHeader) is parsed inside of NewParser, which is where the override for the 'keyword not found' error goes. The original plan was to add a flag that the implementer could set after the creation of the parser and before the actual parsing of the file, but this is not an option because of when parseFileHeader is called.

    I see three options that could fix this. (1) We could move the parseFileHeader call out of NewParser and into func (p *parser) Parse. This would have the implication that the whole file would have to be parsed before the metadata could be read. (2) We could add the force flag as a function parameter in NewParser and subsequent parser-generating functions. This would break backwards compatibility, however, since as far as I can tell Go doesn't have optional function parameters like Python and other languages. (3) We could do a different version of option 2 and duplicate the NewParser functions, adding the force flag as a parameter (something like func NewParserFromFileForce(path string, frameChannel chan *frame.Frame) (Parser, error), which would call func NewParserForce(in io.Reader, bytesToRead int64, frameChannel chan *frame.Frame) (Parser, error)). This would fix the backwards compatibility issue but add bloat code (note: this is less convenient, and the names have to be changed because Go does not appear to support function overloading). I went ahead and implemented option 3 in this PR, but I can go back and change it to whichever option you would like (or any other that I missed). Let me know what works best.

    I also added a commit that fixes the formatting of errors in the dicomutil tool. There was an issue where it would state it was panicking and then, with no space, print the error right after the last word of the panic message. I just added ": " to the end of the panic message, before the error is printed, so the two are distinct and more readable.

    opened by IChocked 16
  • Read/write of SL tag

    Dear all, great work. I have an application using this library to alter a few UIDs. When I load the altered DICOM instance using pydicom, it fails with this message:

    pydicom.errors.BytesLengthException: Expected total bytes to be an even multiple of bytes per value. Instead received b'e\x01' with length 2 and struct format 'l' which corresponds to bytes per value of 4. This occurred while trying to parse (0019, 1026) according to VR 'SL'. To replace this error with a warning set pydicom.config.convert_wrong_length_to_UN = True.

    Here is the tag before and after:

    (0019,1026) SL 357 # 4, 1 DegreesOfAzimuth
    (0019,1026) SL # 2, 0 DegreesOfAzimuth

    It looks like the tag is altered, and I can confirm we are not programmatically changing (0019,1026).

    Any idea what the issue is?


    opened by chuijh 12
  • Dataset.FlatIterator() leaks feeding goroutine if not exhausted.



    While writing some code that uses Dataset.FlatIterator(), I realized that unless the Iterator is exhausted, a goroutine is leaked. That is, the goroutine feeding the iterator will never exit.

    Here's a compact example:

    package main

    import (
        "errors"
        "fmt"
        "log"
        "runtime"

        "github.com/suyashkumar/dicom"
    )

    var DicomPaths = []string{
        // (file paths elided)
    }

    // DoSomething will be used to simulate an error occurring during iteration.
    func DoSomething(element *dicom.Element) error {
        return errors.New("error doing something with element")
    }

    // DatasetInspector ranges over Dataset.FlatIterator() and does something
    // that can fail with each element.
    func DatasetInspector(filepath string) error {
        // First we read the dataset.
        dataset, err := dicom.ParseFile(filepath, nil)
        if err != nil {
            return err
        }
        // Now we iterate over all of our elements and call DoSomething() on each.
        for element := range dataset.FlatIterator() {
            err = DoSomething(element)
            if err != nil {
                return err
            }
        }
        return nil
    }

    func main() {
        // Call our inspector on 5 dicom files. Each one should error and will
        // result in a leaked goroutine.
        for _, thisFile := range DicomPaths {
            err := DatasetInspector(thisFile)
            if err != nil {
                log.Println("ERROR:", err)
            }
        }
        // Force garbage collection to show these goroutines are not cleaned up.
        runtime.GC()
        // If we print the runtime stack we'll see that we have leaked goroutines.
        buffer := make([]byte, 5012)
        runtime.Stack(buffer, true)
        fmt.Println("\nCURRENT STACK:\n", string(buffer))
    }

    Now let's look at our output:

    2021/03/10 13:32:10 ERROR: error doing something with element
    2021/03/10 13:32:10 ERROR: error doing something with element
    2021/03/10 13:32:10 ERROR: error doing something with element
    2021/03/10 13:32:10 ERROR: error doing something with element
    2021/03/10 13:32:10 ERROR: error doing something with element
     goroutine 1 [running]:
            .../dicom/example/main.go:60 +0xf6
    goroutine 6 [chan send]:
    github.com/suyashkumar/dicom.flatElementsIterator(0xc000106400, 0x6d, 0x80, 0xc0000200c0)
            .../dicom/dataset.go:85 +0x147
    github.com/suyashkumar/dicom.(*Dataset).FlatIterator.func1(0xc00000c0d8, 0xc0000200c0)
            .../dicom/dataset.go:70 +0x4a
    created by github.com/suyashkumar/dicom.(*Dataset).FlatIterator
            .../dicom/dataset.go:69 +0x67
    goroutine 7 [chan send]:
    github.com/suyashkumar/dicom.flatElementsIterator(0xc000106800, 0x6c, 0x80, 0xc000020120)
            .../dicom/dataset.go:85 +0x147
    github.com/suyashkumar/dicom.(*Dataset).FlatIterator.func1(0xc00000d8c0, 0xc000020120)
            .../dicom/dataset.go:70 +0x4a
    created by github.com/suyashkumar/dicom.(*Dataset).FlatIterator
            .../dicom/dataset.go:69 +0x67
    goroutine 37 [chan send]:
    github.com/suyashkumar/dicom.flatElementsIterator(0xc000108800, 0x86, 0x100, 0xc000020180)
            .../dicom/dataset.go:85 +0x147
    github.com/suyashkumar/dicom.(*Dataset).FlatIterator.func1(0xc0002b4ff0, 0xc000020180)
            .../dicom/dataset.go:70 +0x4a
    created by github.com/suyashkumar/dicom.(*Dataset).FlatIterator
            .../dicom/dataset.go:69 +0x67
    goroutine 13 [chan send]:
    github.com/suyashkumar/dicom.flatElementsIterator(0xc000106000, 0x59, 0x80, 0xc0000201e0)
            .../dicom/dataset.go:85 +0x147
    github.com/suyashkumar/dicom.(*Dataset).FlatIterator.func1(0xc00000c0d8, 0xc0000201e0)
            .../dicom/dataset.go:70 +0x4a
    created by github.com/suyashkumar/dicom.(*Dataset).FlatIterator
            .../dicom/dataset.go:69 +0x67
    goroutine 14 [chan send]:
    github.com/suyashkumar/dicom.flatElementsIterator(0xc000106c00, 0x52, 0x80, 0xc000020240)
            .../dicom/dataset.go:85 +0x147
    github.com/suyashkumar/dicom.(*Dataset).FlatIterator.func1(0xc00009e198, 0xc000020240)
            .../dicom/dataset.go:70 +0x4a
    created by github.com/suyashkumar/dicom.(*Dataset).FlatIterator
            .../dicom/dataset.go:69 +0x67
    Process finished with exit code 0

    You can see that even after garbage collection, all of the routines feeding our aborted iterations are still running. This means in a long-running service setting, any failure within the range loop will leak the goroutine feeding the iterator, which means also leaking the entire DICOM []*dicom.Element feeding it, as we can see from GoLand's Debugger, with a breakpoint after our main loop:

    [Screenshot: GoLand debugger showing the leaked goroutines retaining their element slices]

    The most straightforward fix, in my opinion, would be to make the method context-aware, so the code could be set up as follows:

    iterCtx, iterCancel := context.WithCancel(context.Background())
    defer iterCancel()
    // Now we iterate over all of our elements, looking for PN values.
    for element := range dataset.FlatIterator(iterCtx) {
        // some code
    }

    But that is also a breaking change. I'm happy to take a crack at a PR if you would like, with this approach or another.
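The benefit of the context-aware approach can be sketched generically in plain Go. flatIterator and abandonedIterationLeaks below are hypothetical stand-ins, not the library's code; the point is that a select on ctx.Done() lets the feeding goroutine exit when iteration is abandoned early.

```go
package main

import (
	"context"
	"fmt"
	"runtime"
	"time"
)

// flatIterator feeds elements on a channel. The select on ctx.Done() lets the
// feeding goroutine exit when the consumer abandons the iteration early.
func flatIterator(ctx context.Context, elements []int) <-chan int {
	ch := make(chan int)
	go func() {
		defer close(ch)
		for _, e := range elements {
			select {
			case ch <- e:
			case <-ctx.Done():
				return // consumer is gone: exit instead of blocking forever
			}
		}
	}()
	return ch
}

// abandonedIterationLeaks runs an iteration that stops early and reports
// whether the feeding goroutine was left running.
func abandonedIterationLeaks() bool {
	before := runtime.NumGoroutine()
	ctx, cancel := context.WithCancel(context.Background())
	for e := range flatIterator(ctx, []int{1, 2, 3, 4, 5}) {
		if e == 2 { // simulate an error mid-iteration
			break
		}
	}
	cancel() // signal the feeding goroutine to exit
	time.Sleep(50 * time.Millisecond)
	return runtime.NumGoroutine() > before
}

func main() {
	fmt.Println("leaked:", abandonedIterationLeaks())
}
```

Without the ctx.Done() case, the goroutine would block forever on the channel send after the consumer breaks out of the loop.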

    bug enhancement 
    opened by bpeake-illuscio 12
  • Introduce element-by-element Writer API, add SkipMetadataReadOnNewParserInit Parser Option, add ParseOptions to Parser API.


    This is my proposal for a Writer struct to make single element writing available to other packages, as discussed in #206. It consists of a new Writer struct, with WriteDataset and WriteElement methods, for whole-dataset and single-element writing respectively.

    opened by kristianvalind 10
  • pkg/dcmtime: Date and Time Parsing


    Adds parsing support for DICOM DA (date), TM (time), and DT (date time) values.

    AS (age string) is not handled by this PR.

    Example parsing DT value:

    // This is a DT value like we would expect.
    daString := "20201210123001.000431+0100"
    // We are parsing the date string without allowing NEMA formats.
    dt, err := ParseDT(daString)
    if err != nil {
        panic(err)
    }
    fmt.Println("TIME VALUE:", dt.Time)
    fmt.Println("PRECISION :", dt.Precision)
    fmt.Println("HAS OFFSET:", dt.HasOffset)
    // Output:
    // TIME VALUE: 2020-12-10 12:30:01.000431 +0100 +0100
    // HAS OFFSET: true

    DICOM time values not only encode a time, but also the precision of that time. If a DT value only specifies a year, that does not mean we should assume January 1st; it means nothing more specific than the year is known or relevant.

    The returned objects will have precision information, and in the case of DT, timezone offset presence data as well.

    Here is an example of parsing a much more bare bones DT value:

    // This is a shorter DT value, precise only to the hour.
    daString := "2020121012"
    dt, err := ParseDT(daString)
    if err != nil {
        panic(err)
    }
    fmt.Println("TIME VALUE:", dt.Time)
    fmt.Println("PRECISION :", dt.Precision)
    fmt.Println("HAS OFFSET:", dt.HasOffset)
    // Output:
    // TIME VALUE: 2020-12-10 12:00:00 +0000 +0000
    // HAS OFFSET: false

    We can also create new DT objects as well:

    // We'll use the reference date as our date. (The time.Parse arguments were
    // elided from this excerpt.)
    timeVal, err := time.Parse( ... )
    if err != nil {
        panic(err)
    }
    // Create a new DT object like so:
    dt := NewDT(timeVal, Precision.Full, false)
    // Get the DICOM string value.
    fmt.Println("DT:", dt.DCM())
    // Output:
    // DT: 20060102150405.123456+0100

    When making a new DT object, we must specify the precision of our time.Time value, and whether the offset is known:

    // Again using the reference date (time.Parse arguments elided).
    timeVal, err := time.Parse( ... )
    if err != nil {
        panic(err)
    }
    // Create a new DT object like so. If we set ignoreOffset to true, the
    // offset will not be reflected in the DT value:
    dt := NewDT(timeVal, Precision.Minutes, true)
    // Get the DICOM string value.
    fmt.Println("DT:", dt.DCM())
    // Output:
    // DT: 200601021504

    Possibly closes #148

    opened by bpeake-illuscio 10
  • Keyword 'DICM' not found in the header (file offset 132)



    I have an X-ray DICOM failing to initialize the parser because of:

    Keyword 'DICM' not found in the header (file offset 132)

    I can dump it fine with dcmtk's dcmdump. The good news is that the competition fails as well:

    Traceback (most recent call last):
      File "../bench/dicom_bench.py", line 29, in <module>
        ds = pydicom.dcmread(fn)
      File "/home/ubuntu/.local/lib/python3.6/site-packages/pydicom/filereader.py", line 870, in dcmread
        force=force, specific_tags=specific_tags)
      File "/home/ubuntu/.local/lib/python3.6/site-packages/pydicom/filereader.py", line 667, in read_partial
        preamble = read_preamble(fileobj, force)
      File "/home/ubuntu/.local/lib/python3.6/site-packages/pydicom/filereader.py", line 620, in read_preamble
        raise InvalidDicomError("File is missing DICOM File Meta Information "
    pydicom.errors.InvalidDicomError: File is missing DICOM File Meta Information header or the 'DICM' prefix is missing from the header. Use force=True to force reading.
    opened by wkoszek 9
  • Write Library changes to S/1.0 rewrite Branch


    This PR includes a few minor changes to the write library which allow rewritten DICOMs to be identical to the original files. Notable changes include:

    • Made newElement() and mustNewElement() exported functions so that they can be used to create artificial Datasets for writing
    • Added writing of the FileMetaInformationVersion header in the correct spot to avoid the "Dataset not in ascending order" error
    • Removed an error check that the Values array passed to the writeBytes() function is only of length 1
    • Removed a duplicate call to encodeElementHeader() when writing PixelData
    • Added "DS" and "CS" VRs to the list of string-type headers that should be padded with a space, as opposed to NULL

    I'll be testing this solution at scale on thousands of open DICOMs to ensure compatibility with as many files as possible.

    Here is a dcmdump of the DICOM I used for testing. Note that there are two values for FileMetaInformationVersion:

    # Dicom-File-Format
    # Dicom-Meta-Information-Header
    # Used TransferSyntax: Little Endian Explicit
    (0002,0000) UL 196                                      #   4, 1 FileMetaInformationGroupLength
    (0002,0001) OB 00\01                                    #   2, 1 FileMetaInformationVersion
    (0002,0002) UI =ComputedRadiographyImageStorage         #  26, 1 MediaStorageSOPClassUID
    (0002,0003) UI [] #  54, 1 MediaStorageSOPInstanceUID
    (0002,0010) UI =LittleEndianImplicit                    #  18, 1 TransferSyntaxUID
    (0002,0012) UI []            #  28, 1 ImplementationClassUID
    (0002,0013) SH [OFFIS_DCMTK_362]                        #  16, 1 ImplementationVersionName
    # Dicom-Data-Set
    # Used TransferSyntax: Little Endian Implicit
    (0008,0000) UL 288                                      #   4, 1 GenericGroupLength
    (0008,0008) CS [ORIGINAL PRIMARY]                       #  16, 1 ImageType
    (0008,0012) DA [1995.07.20]                             #  10, 1 InstanceCreationDate
    (0008,0016) UI =ComputedRadiographyImageStorage         #  26, 1 SOPClassUID
    (0008,0018) UI [] #  54, 1 SOPInstanceUID
    (0008,0022) DA [1993.02.24]                             #  10, 1 AcquisitionDate
    (0008,0032) TM [17:58:00]                               #   8, 1 AcquisitionTime
    (0008,0050) SH [FUJI95714]                              #  10, 1 AccessionNumber
    (0008,0060) CS [CR]                                     #   2, 1 Modality
    (0008,0070) LO [FUJI PHOTO FILM CO. LTD.]               #  24, 1 Manufacturer
    (0008,0080) LO [FUJI PHOTO FILM CO.,]                   #  20, 1 InstitutionName
    (0008,1040) LO [!!!!!!!!]                               #   8, 1 InstitutionalDepartmentName
    (0008,1090) LO [9000]                                   #   4, 1 ManufacturerModelName
    (0010,0000) UL 0                                        #   4, 1 GenericGroupLength
    (0018,0000) UL 140                                      #   4, 1 GenericGroupLength
    (0018,0015) CS [CHEST]                                  #   6, 1 BodyPartExamined
    (0018,1050) DS [0.2]                                    #   4, 1 SpatialResolution
    (0018,1260) SH [ST]                                     #   2, 1 PlateType
    (0018,1261) LO [5]                                      #   2, 1 PhosphorType
    (0018,1400) LO [[CHEST,GENERAL     ]]                   #  20, 1 AcquisitionDeviceProcessingDescription
    (0018,1401) LO [00]                                     #   2, 1 AcquisitionDeviceProcessingCode
    (0018,1402) CS [PORTRAIT]                               #   8, 1 CassetteOrientation
    (0018,1403) CS [35CMX35CM]                              #  10, 1 CassetteSize
    (0018,5101) CS [PA]                                     #   2, 1 ViewPosition
    (0018,6000) DS [357]                                    #   4, 1 Sensitivity
    (0020,0000) UL 134                                      #   4, 1 GenericGroupLength
    (0020,000d) UI [] #  54, 1 StudyInstanceUID
    (0020,000e) UI [] #  54, 1 SeriesInstanceUID
    (0020,0013) IS [16]                                     #   2, 1 InstanceNumber
    (0028,0000) UL 124                                      #   4, 1 GenericGroupLength
    (0028,0002) US 1                                        #   2, 1 SamplesPerPixel
    (0028,0004) CS [MONOCHROME1]                            #  12, 1 PhotometricInterpretation
    (0028,0010) US 440                                      #   2, 1 Rows
    (0028,0011) US 440                                      #   2, 1 Columns
    (0028,0034) IS [01]                                     #   2, 1 PixelAspectRatio
    (0028,0100) US 16                                       #   2, 1 BitsAllocated
    (0028,0101) US 10                                       #   2, 1 BitsStored
    (0028,0102) US 9                                        #   2, 1 HighBit
    (0028,0103) US 0                                        #   2, 1 PixelRepresentation
    (0028,1050) DS [550]                                    #   4, 1 WindowCenter
    (0028,1051) DS [1024]                                   #   4, 1 WindowWidth
    (2010,0000) UL 14                                       #   4, 1 GenericGroupLength
    (2010,0100) CS [BLACK]                                  #   6, 1 BorderDensity
    (7fe0,0000) UL 387208                                   #   4, 1 GenericGroupLength
    (7fe0,0010) OW 0000\00d1\030e\03b2\03ba\03bc\03bc\03bc\03be\03bf\03c0\03bc\03ab... # 387200, 1 PixelData
    opened by jasperdenotter 6
  • Setter for Value Types in element.go?



    Pre-release v1.0.0-alpha.2


    One of the variables for the Element struct is Value, which is itself a Value struct (see here). This struct presumably stores the main data for the element (e.g. patient name). Each Value struct (e.g. StringsValue) contains a single variable called value. value, however, is unexported (it is lowercase), so the implementer of the library has no access to it.


    We are unable to create a new element with an actual value because we don't have access to the value variable in any *Value struct outside the dicom package. For example, attempting to create an element with this:

    studyDescElem := dicom.Element{
        Tag:                    dicomtag.Tag{0x0008, 0x1030},
        RawValueRepresentation: "LO",
        Value: &dicom.StringsValue{
            value: []string{studyDescription},
        },
    }

    Yields the error unknown field 'value' in struct literal of type dicom.StringsValue

    Expected behavior

    It is useful for the implementer to be able to create an Element struct (e.g. for dicom generation during unit testing), like this:

    e := &Element{
        Tag:                    tag.Rows,
        ValueRepresentation:    tag.VRInt32List,
        RawValueRepresentation: "US",
        Value: &intsValue{
            value: []int{100},
        },
    }


    Add setter functions for each *Value struct (e.g. func (s *StringsValue) Set([]string)) or capitalize the value variable in each *Value struct.
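As a standalone sketch of the setter approach (stringsValue below is an illustrative stand-in, not the dicom package's actual type):

```go
package main

import "fmt"

// stringsValue keeps its data unexported, mirroring the situation described
// above, but exposes a setter so callers outside the package could populate it.
type stringsValue struct {
	value []string
}

// Set assigns the underlying data while keeping the field itself unexported.
func (s *stringsValue) Set(v []string) { s.value = v }

// Get returns the underlying data.
func (s *stringsValue) Get() []string { return s.value }

func main() {
	var v stringsValue
	v.Set([]string{"CHEST CT"})
	fmt.Println(v.Get())
}
```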

    @suyashkumar let me know what you think, and if I'm missing something or if there's another way to create an element with a value :)

    opened by IChocked 6
  • Feature/Value Multiplicity Info


    Once again, I love this lib. Here's a PR with some functionality I have been needing in my own work, refactored to be a first class citizen of the lib.

    Sometimes when working with DICOMs (especially when writing), it is useful to inspect the Value Multiplicity / Cardinality of a tag to make sure it lines up with the DICOM spec.

    Right now we just get a raw string like '1', '1-2', '1-n', or '2-2n'. Inside these strings is some encoded information that is important if we are verifying a DICOM is valid per the spec.

    This PR introduces a new VMInfo field on the tag.Info struct which is a struct of the same name.

    VMInfo in turn, declares the following fields:

    • Minimum: the minimum number of values; 2 when VM is equal to '2', '2-n', or '2-2n'.
    • Maximum: the maximum number of values; 2 when VM is equal to '2' or '1-2', and -1 when unbounded (VM is '1-n').
    • Step: 1, except when VM is similar to '2-2n', in which case it is equal to 2.
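The encoded information in those raw VM strings can be unpacked with logic along these lines (parseVM is a hypothetical sketch, not the PR's actual code):

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// parseVM converts a raw DICOM VM string such as "1", "1-2", "1-n", or "2-2n"
// into (minimum, maximum, step), where maximum is -1 when unbounded.
func parseVM(vm string) (minimum, maximum, step int, err error) {
	step = 1
	parts := strings.SplitN(vm, "-", 2)
	minimum, err = strconv.Atoi(parts[0])
	if err != nil {
		return 0, 0, 0, err
	}
	if len(parts) == 1 {
		// A bare number like "1" or "2": exactly that many values.
		return minimum, minimum, step, nil
	}
	upper := parts[1]
	if strings.HasSuffix(upper, "n") {
		maximum = -1 // unbounded, e.g. "1-n" or "2-2n"
		if mult := strings.TrimSuffix(upper, "n"); mult != "" {
			// "2-2n" means values must come in multiples of 2.
			step, err = strconv.Atoi(mult)
			if err != nil {
				return 0, 0, 0, err
			}
		}
		return minimum, maximum, step, nil
	}
	// A bounded range like "1-2".
	maximum, err = strconv.Atoi(upper)
	return minimum, maximum, step, err
}

func main() {
	for _, vm := range []string{"1", "1-2", "1-n", "2-2n"} {
		mn, mx, st, _ := parseVM(vm)
		fmt.Printf("%s => min=%d max=%d step=%d\n", vm, mn, mx, st)
	}
}
```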

    The Python script for auto-generating our dict of known tags has been updated to include this information. There were a few places where the type names it generated seemed out of date, so those were updated.

    Tests have been added to ensure:

    1. All tags in the current parsed spec are parsed correctly.
    2. Helper functions on the VMInfo object work correctly.

    Let me know what you think, and if this is something that might be useful for your library.


    If I were designing the API from scratch I would probably replace "VM" with this VMInfo object, then move the raw string into VMInfo as a "Raw" string field, but this would obviously break backward compatibility.

    opened by bpeake-illuscio 5
  • Add raw VR consts


    Hello! I love this library; I hope you do not mind me opening a few PRs for some quick quality-of-life features. I'll start with this one, and once I get some feedback, move on to a few others.

    In using this library, I've found it useful to have a set of aliases for all the raw value representations defined in the dicom spec, for instance:

    if element.RawValueRepresentation == vrraw.PersonName {
        // do something
    }

    This PR adds a vrraw package with a set of string consts for inspecting the RawValueRepresentation without having to worry about typos.
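As a rough illustration, such a set of consts might look like the following. The identifiers here are my guesses for illustration (the two-letter codes are the VR codes from the DICOM standard); see the actual vrraw package for its real names.

```go
package main

import "fmt"

// Illustrative string consts for a few raw Value Representations. The
// two-letter codes come from the DICOM standard.
const (
	PersonName = "PN" // Person Name
	Date       = "DA" // Date
	Time       = "TM" // Time
	LongString = "LO" // Long String
	SignedLong = "SL" // Signed Long
)

func main() {
	rawVR := "PN" // e.g. element.RawValueRepresentation
	if rawVR == PersonName {
		fmt.Println("element holds a person name")
	}
}
```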

    The package name "vrraw" was chosen over "vr" to:

    1. Match more closely element.RawValueRepresentation
    2. Avoid importing a package named vr into the namespace, which would remove vr from variable-name availability

    I also considered the possibility of making a VRRaw custom type off of string, like VRKind, and have these consts use that type. In such a case, we would also want to change element.RawValueRepresentation to the new type, but such a change could conceivably break code.

    The most common case would still work:

    // Go will cast the literal to the correct type at compile time.
    if element.RawValueRepresentation == "PN" {
        // do something
    }

    But this case would break:

    func ReturnsPN() string {
        return "PN"
    }

    // You can't compare values of two different types, and non-literals are not
    // automatically converted, so this would break:
    if element.RawValueRepresentation == ReturnsPN() {
        // do something
    }

    As would this:

    func TakesVR(vr string) string {
        return "PN"
    }

    // This will break: passed variables cannot be implicitly converted.
    TakesVR(element.RawValueRepresentation)

    Therefore, in order to keep backwards compatibility, I kept this PR small in scope: make string consts for the DICOM-spec-defined Value Representations, but do not make these values their own type.

    Let me know what you think! Thank you!

    opened by bpeake-illuscio 5
  • Add `SkipPixelData` option to `Parser`


    • For heavy imaging modalities, it is expensive to load the pixel data into memory, and for many use cases, unnecessary. This new option allows users to parse the other attributes without the resource hit.
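Parser options like this are commonly implemented with Go's functional-options pattern; here is a minimal self-contained sketch of that pattern (the names are illustrative, not this library's actual types):

```go
package main

import "fmt"

// parseOpts collects optional parser settings.
type parseOpts struct {
	skipPixelData bool
}

// ParseOption mutates parseOpts; callers pass zero or more of these.
type ParseOption func(*parseOpts)

// SkipPixelData returns an option that tells the parser to skip loading
// pixel data into memory.
func SkipPixelData() ParseOption {
	return func(o *parseOpts) { o.skipPixelData = true }
}

// newParseOpts applies the given options over the defaults.
func newParseOpts(options ...ParseOption) parseOpts {
	var o parseOpts
	for _, opt := range options {
		opt(&o)
	}
	return o
}

func main() {
	o := newParseOpts(SkipPixelData())
	fmt.Println("skip pixel data:", o.skipPixelData)
}
```

This pattern keeps the constructor's signature stable while letting new options be added without breaking existing callers.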


    $ go test -v -run TestNewParser ./...
    === RUN   TestNewParserSkipMetadataReadOnNewParserInit
    --- PASS: TestNewParserSkipMetadataReadOnNewParserInit (0.00s)
    === RUN   TestNewParserSkipPixelData
    --- PASS: TestNewParserSkipPixelData (0.01s)
    === RUN   TestNewParserPixelData
    --- PASS: TestNewParserPixelData (0.62s)
    ok      github.com/suyashkumar/dicom    0.677s
    ?       github.com/suyashkumar/dicom/cmd/dicomutil      [no test files]
    ?       github.com/suyashkumar/dicom/mocks/pkg/dicomio  [no test files]
    ?       github.com/suyashkumar/dicom/pkg/charset        [no test files]
    testing: warning: no tests to run
    ok      github.com/suyashkumar/dicom/pkg/dcmtime        0.011s [no tests to run]
    ?       github.com/suyashkumar/dicom/pkg/debug  [no test files]
    ?       github.com/suyashkumar/dicom/pkg/dicomio        [no test files]
    testing: warning: no tests to run
    ok      github.com/suyashkumar/dicom/pkg/frame  0.003s [no tests to run]
    testing: warning: no tests to run
    ok      github.com/suyashkumar/dicom/pkg/personname     0.017s [no tests to run]
    testing: warning: no tests to run
    ok      github.com/suyashkumar/dicom/pkg/tag    0.007s [no tests to run]
    ?       github.com/suyashkumar/dicom/pkg/uid    [no test files]
    ?       github.com/suyashkumar/dicom/pkg/vrraw  [no test files]
    opened by tcheever 6
  • fix the problem that cannot parse dicom file with SQ of undefined length

    fix the problem that cannot parse dicom file with SQ of undefined length

    Hello, I found a problem when using this library to parse a DICOM file. The problem occurs when parsing a DICOM file containing an SQ of undefined length, which uses an Item Delimitation Tag (FFFE, E00D) and a Sequence Delimitation Tag (FFFE, E0DD). The parser can read these tags, but it then tries to read a value for them as a string. Because the stated length of such an item is the meaningless value FFFFFFFF, the parser reads all the remaining data as a string value and hits an EOF error.

    A simple solution is to check the tag in the readElement function: if it is a delimitation tag, return immediately instead of trying to read its value. To verify this solution, I created a special DICOM file, 'testdata/6.dcm', which contains only an SQ of undefined length in its dataset.

    For more detail about SQs of undefined length, click here and read Table 7.5-3.
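    The proposed check can be sketched in isolation (a standalone illustration, not the library's actual readElement code): both delimitation tags live in group FFFE, and the reader should return once it sees one rather than treating the undefined length 0xFFFFFFFF as a real value length:

    ```go
    package main

    import "fmt"

    // Tag mirrors a DICOM (group, element) pair for this sketch.
    type Tag struct {
        Group, Element uint16
    }

    var (
        itemDelimitationTag     = Tag{0xFFFE, 0xE00D} // Item Delimitation Item
        sequenceDelimitationTag = Tag{0xFFFE, 0xE0DD} // Sequence Delimitation Item
    )

    // isDelimitationTag reports whether t marks the end of an undefined-length
    // item or sequence; such tags carry no value, so the reader must not
    // attempt to read one.
    func isDelimitationTag(t Tag) bool {
        return t == itemDelimitationTag || t == sequenceDelimitationTag
    }

    func main() {
        fmt.Println(isDelimitationTag(Tag{0xFFFE, 0xE00D})) // true
        fmt.Println(isDelimitationTag(Tag{0x0010, 0x0010})) // false: PatientName
    }
    ```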

    opened by Louis296 0
  • Can not load dicom

    Can not load dicom

    I can't load this DICOM, but pydicom can with the "force=True" parameter. Is there any hotfix for this kind of DICOM?

    Error message: read.go:534: error reading value unexpected EOF (attached: test-dicom.dcm.zip)

    opened by elviskuo07 1
  • De-identify DICOM tags

    De-identify DICOM tags

    I'm trying to update/de-identify some of the metadata tags (such as Patient ID). Is there a way to update the file directly without reading the file and writing back to the file? Currently, I'm reading the file into a variable, updating the elements, and writing back to a new file as below:

    dataset, _ := dicom.ParseFile("dicom/1.dcm", nil)
    for _, elem := range dataset.Elements {
        // Replace slice-valued elements with a placeholder value.
        s := reflect.TypeOf(elem.Value.GetValue()).Kind()
        if s == reflect.Slice {
            elem.Value, _ = dicom.NewValue([]string{"abc"})
        }
    }
    // Note: os.Create (not os.Open) is needed to get a writable file.
    f, _ := os.Create("test.dcm")
    defer f.Close()
    _ = dicom.Write(f, dataset)
    opened by vijaysubramanya 2
Suyash Kumar
Software Engineer @google health, formerly founder at @gradienthealth and @microelastic, eng @uber, BSE in Biomedical Eng & Computer Science from Duke.