Getting Started

This article guides you through setting up a Conical instance and uploading your first set of data to the tool.

If you’re solely interested in installing a new version of Conical, then installation instructions can be found here.


Prerequisites

  • Ability to run a Docker container (x86-64 currently)
  • Access to a SQL Server instance (2016 onwards, or cloud-hosted)
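As a rough illustration of what such a deployment might look like, here is a minimal Docker Compose sketch. The image name, port mapping and environment variable are placeholders for illustration only, not Conical’s actual values — consult the installation instructions for the real ones.

```yaml
# Hypothetical sketch only - the image name, port and environment
# variable names below are placeholders, not Conical's actual values.
services:
  conical:
    image: example/conical:latest        # placeholder image name
    ports:
      - "8080:80"                        # expose the web UI / REST API
    environment:
      # connection string for the SQL Server (2016+) instance
      DB_CONNECTION_STRING: "Server=sqlserver;Database=Conical;User Id=conical;Password=..."
```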

High level steps (server set up)

  1. Install Server
  2. Create product within Conical
    1. Create appropriate test run types
    2. Add initial components for each test run type (e.g. ResultsXml, ResultsJson)
  3. [Optional] Create specific upload role and user for uploading test data to Conical.
  4. Create a personal access token. This can be a ‘restricted scope’ token if the previous step was skipped, or a ‘full privileges’ token if it will be reused for other products etc.

At this stage, your Conical instance is fully functional and ready to receive data.

Data population – differences

NB The tool itself is agnostic as to the language used to call it (all communication is via a REST API, so access is possible from any language). In this section we will assume C#, but the principles are the same in any language.

If you prefer to read code examples rather than prose, then look at this GitHub repository, which contains the source code we use to keep the demo instance up to date with daily data.

The main question for data population is what you wish to test. The tool is designed to store the results of many different types of tests, including different types within the same test portfolio (this is why the tool allows for multiple test run types).

Usually, the tests take the form of comparing the granular outputs of high-level components. An example from financial services would be comparing the trade / position level results of a risk or P&L calculation for a portfolio. For the purposes of this article, this is assumed to be your use-case: two calls can be made to your infrastructure, one for the set of expected results and one for the set of candidate results.

Once the choice has been made as to what’s being compared, then it behoves us to create the application to perform the comparisons and to upload the data to Conical. The most important thing to remember here is that the data structure for comparisons is determined by your use-case and not the tool.

In general, the pattern that we use is:

  • Define your ‘unit of difference’, typically a trade / position
  • Select a key for this unit; this is either an identifier or the location within the hosting array
  • Extract your objects
  • Compare the objects
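The four steps above can be sketched as follows. This is an illustrative Python outline of the pattern only (the article assumes C#, and the `tradeId` field and function names here are hypothetical, not part of Conical’s API):

```python
# Sketch of the pattern above, assuming the 'unit of difference' is a
# trade keyed by an identifier. All names here are illustrative.

def key_by_id(trades):
    """Select a key for each unit: here, the trade's identifier."""
    return {trade["tradeId"]: trade for trade in trades}

def compare(expected_trades, candidate_trades):
    """Extract the objects by key, pair them up and compare them."""
    expected = key_by_id(expected_trades)
    candidate = key_by_id(candidate_trades)
    differences = []
    for trade_id in expected.keys() | candidate.keys():
        if trade_id not in candidate:
            differences.append((trade_id, "missing from candidate"))
        elif trade_id not in expected:
            differences.append((trade_id, "unexpected in candidate"))
        elif expected[trade_id] != candidate[trade_id]:
            differences.append((trade_id, "values differ"))
    return differences

expected = [{"tradeId": "T1", "pnl": 100.0}, {"tradeId": "T2", "pnl": -5.0}]
candidate = [{"tradeId": "T1", "pnl": 100.0}, {"tradeId": "T2", "pnl": -4.5}]
print(compare(expected, candidate))   # [('T2', 'values differ')]
```

If your units have no natural identifier, the key can simply be the index within the hosting array, as noted above.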

NB The objects for comparison can be of any type, including JSON documents (System.Text.Json or Json.NET) and XML documents. Pre-supplied custom flatteners for these types are available, and custom flatteners / comparison plugins can be self-provided. This ability to compare JsonElement / JObject objects means that it’s very easy to compare arbitrary data structures: when they’re rehydrated from their JSON representations, the full data structure is retained.

The comparison of the objects themselves is usually done by flattening them down into a set of key-value pairs per object and then iterating through these sets (one for the candidate and one for the expected) to find any differences.
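A minimal sketch of this flatten-and-iterate approach, in illustrative Python (this mimics the idea described above; it is not one of the pre-supplied flatteners):

```python
# Flatten a nested structure into dotted-path -> value pairs, then
# iterate over both flattened sets to find differences. Illustrative
# only - not Conical's own flattener implementation.

def flatten(obj, prefix=""):
    items = {}
    if isinstance(obj, dict):
        for key, value in obj.items():
            items.update(flatten(value, f"{prefix}{key}."))
    elif isinstance(obj, list):
        for index, value in enumerate(obj):
            items.update(flatten(value, f"{prefix}{index}."))
    else:
        items[prefix.rstrip(".")] = obj
    return items

def diff(expected, candidate):
    """Report every key whose value differs between the two sets."""
    flat_e, flat_c = flatten(expected), flatten(candidate)
    return {
        key: (flat_e.get(key), flat_c.get(key))
        for key in flat_e.keys() | flat_c.keys()
        if flat_e.get(key) != flat_c.get(key)
    }

expected = {"trade": {"id": "T1", "legs": [{"pnl": 10.0}, {"pnl": 5.0}]}}
candidate = {"trade": {"id": "T1", "legs": [{"pnl": 10.0}, {"pnl": 5.5}]}}
print(diff(expected, candidate))   # {'trade.legs.1.pnl': (5.0, 5.5)}
```

Because a rehydrated JSON document flattens the same way regardless of its shape, this one mechanism handles arbitrary data structures.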

A concrete example of the above functionality is available here.

All of this functionality is available through NuGet.

The source code for all of these libraries is available on our public GitHub page.

Data population – uploading

Uploading to the tool can be done in two ways: either by using the .NET access library (NuGet), or by directly calling the REST API if you have special requirements or are not using .NET.

Instructions on how to use the NuGet package and its data model are available on its GitHub page.

To go from the comparison results to a format suitable for uploading, we typically serialise the object to JSON and then publish that payload to the instance during upload. This can be done with any JSON library. Note that the website automatically formats the uploaded JSON document at display time, so it’s not necessary to pretty-print prior to upload. The same also holds true if you use XML.
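As a sketch of the serialise-then-publish step (illustrative Python using only the standard library; the payload shape, the /api/upload path and the bearer-token header are all assumptions for illustration — Conical’s actual data model is documented on the access library’s GitHub page, and its endpoints at /swagger):

```python
import json
import urllib.request

# Hypothetical payload shape - Conical's actual upload data model is
# documented on the access library's GitHub page.
results = {
    "testRunType": "regression",
    "differences": [
        {"key": "trade.legs.1.pnl", "expected": 5.0, "candidate": 5.5},
    ],
}

# No pretty-printing needed: the website formats JSON at display time.
payload = json.dumps(results).encode("utf-8")

def upload(base_url, token):
    """Sketch of a direct REST upload; path and auth scheme are placeholders."""
    request = urllib.request.Request(
        f"{base_url}/api/upload",                  # placeholder endpoint path
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",    # assumed use of the personal access token
        },
    )
    return urllib.request.urlopen(request)

print(payload.decode("utf-8"))
```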

If you need to call the REST API directly, then navigate to /swagger on your instance and the API will be displayed. Note that we are looking to add native access clients for additional languages, so please do contact us if you have a particular requirement.


By this stage, you should be able to:

  • Compare your chosen data structures
  • Push the results to your instance
  • View the results in your instance

You can now extend your app to cover more use-cases, automate it through your CI/CD process, run it with multiple criteria etc. The choice is yours as to what’s the best use of your testing budget.

If you have any questions then please contact us and we’ll get back to you.