The Gigantum User Hub

Welcome to the Gigantum user hub. Here you'll find comprehensive guides and documentation to help you start working with the Gigantum platform as quickly as possible. If you get stuck, help is available too.

Let's get to it!


Data De-Duplication

Datasets de-duplicate data to save local disk space. This happens in two ways. First, files are stored under a hash of their content and then hard linked into place for actual use. This means that if two files have exactly the same content, only one copy is stored.
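To make the idea concrete, here is a minimal sketch of content-addressed storage with hard links. This is an illustration only, not Gigantum's actual implementation; the use of SHA-256 and the flat `objects/` directory layout are assumptions for the example.

```python
import hashlib
import os
import tempfile

def store_file(src_path, store_dir):
    """Store a file under the SHA-256 hash of its content.

    If an identical file was stored before, the existing object is
    reused instead of writing a second copy. Returns the object path.
    """
    with open(src_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    obj_path = os.path.join(store_dir, digest)
    if not os.path.exists(obj_path):
        # Hard link: the object shares the source file's bytes on disk.
        os.link(src_path, obj_path)
    return obj_path

# Demo: two files with identical content map to one stored object.
root = tempfile.mkdtemp()
store = os.path.join(root, "objects")
os.makedirs(store)

file_a = os.path.join(root, "a.csv")
file_b = os.path.join(root, "b.csv")
for path in (file_a, file_b):
    with open(path, "w") as f:
        f.write("same,content\n")

obj_a = store_file(file_a, store)
obj_b = store_file(file_b, store)
# obj_a == obj_b: identical content hashes to the same object.
```

Because the object is a hard link rather than a copy, de-duplication costs no extra disk space beyond the single stored version of each unique file.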

The second way files are de-duplicated is across Projects. It is common to want to use the same data in many Projects. If you embed the data directly in a Project, each Project must keep a complete copy. With Datasets, files are only linked into a Project at runtime, so only one copy of the files is needed.
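The cross-Project case can be sketched the same way: one file on disk, hard linked into several project workspaces. The directory names below are hypothetical, chosen only for the example.

```python
import os
import tempfile

# One dataset file, linked into two project workspaces.
root = tempfile.mkdtemp()
dataset_file = os.path.join(root, "dataset", "weather.csv")
os.makedirs(os.path.dirname(dataset_file))
with open(dataset_file, "w") as f:
    f.write("day,temp\nmon,21\n")

for project in ("project-a", "project-b"):
    inputs = os.path.join(root, project, "input")
    os.makedirs(inputs)
    # Hard link the dataset file into the project's input directory.
    os.link(dataset_file, os.path.join(inputs, "weather.csv"))

# All three paths share one inode, i.e. one copy of the bytes on disk.
inode = os.stat(dataset_file).st_ino
linked = os.path.join(root, "project-a", "input", "weather.csv")
assert os.stat(linked).st_ino == inode
```

Each additional project that links the file adds only a directory entry, not another copy of the data.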
