
AURA Documentation

Welcome to AURA, an institution-grade deterministic data analysis engine powered by Matrix Product States. Grounded in mathematical physics, AURA eliminates conversational hallucinations and ensures verifiable integrity across complex datasets.

Technical White Paper

"Grounding a Large Language Model with Tensor Network Coefficients to Force Deterministic Data Analysis." Read the foundation behind AURA's pure abstraction layer.

Read the Paper

Overview

AURA breaks away from standard Retrieval-Augmented Generation (RAG) patterns. Standard generative models guess at connections in tabular data, producing confident but mathematically flawed insights. AURA instead computes a Matrix Product State (MPS) topological layer entirely detached from the LLM, reducing all combinations to structural reality so that the generative interface only translates, never invents.

Data Ingestion

Dragging and dropping CSV, JSON, or Excel files straight into the workspace initiates the ingestion layer. AURA immediately structures categorical metadata, numeric types, and row geometries, safely packaging them for encrypted transport to the hardware simulation phase.

Data Reconfiguration

Upon arrival, your arrays are dynamically mapped into Ising spin-glass topologies. Values are transformed into quantum coefficients, and a "Ground State" energy-reduction process continuously reorganizes rows to surface physical correlations invisible to conventional analysis.

Data Querying

Instead of navigating a dashboard of pre-built graphs, simply ask natural language questions. The Translation Ledger takes your query, compares it against the hard mathematical ledger compiled by the solver, and returns an answer guaranteed to reflect the data's absolute mathematical state.

Reading Validation Information

Every time AURA generates a response block, it displays a Confidence and Noise signature. The MPS hardware returns absolute percentages reflecting the convergence threshold. If the math correlates flawlessly, your confidence score remains locked at 99.9%.

Exporting Validation Info

Validation integrity artifacts can be saved. Your workspace allows you to export the detailed Ground State Energy metrics, execution telemetry, and deterministic correlation traces directly to PDF or JSON for compliance sharing.

Branching & "My Instances"

AURA supports infinite deterministic timelines for Professional-tier users. When you upload data, it is immediately assigned an Untitled [n] workspace mapping. At the top of your console, you can natively rename these instances and permanently "Save" them into the global registry.

Branching lets you fork the timeline of your dataset. By naming and saving multiple permutations, you can instantly swap between them securely using the "My Instances" dropdown unified in the global header, completely abstracting away the need to re-upload.

Industry Examples

1. Pharmaceuticals

Mapping cross-reactions of organic compounds within clinical trials to discover outlier anomalies undetectable by standard regression.

2. Dynamic Logistics

Balancing a global supply chain where weather, warehouse capacity, and fleet availability are processed simultaneously to unblock friction.

3. Quantitative Finance

Interrogating high-frequency order books. Discovering correlation structures between dark pool trading volumes and spot volatility.

4. Grid Energy Management

Predicting grid strain across regional transformers based on temperature, residential influx arrays, and solar efficiency decay.

5. Defense / Cybersecurity

Correlating network packet ingress metadata to spot topological similarity arrays associated with distributed zero-day attacks.

Plans & Pricing

Choose the plan that fits your needs. Every plan includes the same powerful analysis engine — the only difference is how much data you can upload and the resources dedicated to your work.

Free
$0

Try everything with no commitment. Upload a spreadsheet, ask questions in plain English, and see real results — no credit card needed.

Upload files up to 50 MB
5 questions per session
Full accuracy & validation
Export results as JSON
Get Started Free
Standard
$20/month

For individuals and small teams who need to work with real-world datasets. Ask unlimited questions and get consistent, accurate answers.

Upload files up to 200 MB
Unlimited questions
Shared compute resources
Download results as CSV or JSON
Upgrade to Standard
Most Popular
Professional
$199/month

Built for professionals and organizations handling large or sensitive datasets. Get dedicated compute power, faster results, and full API access to integrate AURA into your own tools and workflows.

Upgrade to Pro
Upload files up to 2 TB
Unlimited questions
Dedicated compute — faster results
Priority processing queue
Full API access for automation
Save & manage workspace instances
Export detailed audit reports

Enterprise

Need custom data limits, dedicated servers, or a signed BAA for regulatory compliance? We build tailored plans for organizations with specific infrastructure and security requirements.

Contact Sales

Quick Comparison

Feature                  Free       Standard    Pro
Max upload size          50 MB      200 MB      2 TB
Questions per session    5          Unlimited   Unlimited
Compute resources        Shared     Shared      Dedicated
Processing priority      Standard   Standard    Priority
API access               No         No          Yes
Save workspaces          No         No          Yes
Audit report exports     No         Basic       Full

Sovereign Financial Gateway

All subscription upgrades are processed via Square Hosted Checkout. Dynsell never touches or stores your raw card data. Your payment information is entered directly on Square's PCI-compliant infrastructure, ensuring absolute financial isolation from our research hardware.

No risk to try

The free plan requires no credit card. Upload your data and see results before you commit to anything.

Cancel anytime

Paid plans are month-to-month. Cancel or downgrade from your account settings whenever you want. No contracts, no penalties.

Your data stays yours

We never store, train on, or share your data. It is processed in isolated memory and destroyed when your session ends.

Accuracy Testing (In Plain Terms)

Unlike traditional generative AI, where "accuracy" depends on subjective evaluation, AURA's accuracy is a mathematical constant. We conduct Matrix Product sweep testing on synthetically generated datasets with known embedded correlations. The engine must recover these correlations at better than 99% accuracy before any build is deployed to production.
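The idea behind this testing approach can be sketched in a few lines: plant a known correlation in synthetic data, then require it to be recovered above the threshold. This is an illustrative stand-in, not the actual AURA test harness; the slope, noise level, and sample size are arbitrary assumptions.

```python
import random

# Illustrative stand-in for the accuracy harness (not AURA's real suite):
# embed a known linear correlation, then verify it is recovered above 99%.
def pearson(xs, ys):
    """Plain Pearson correlation coefficient, stdlib only."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = sum((a - mx) ** 2 for a in xs)
    vy = sum((b - my) ** 2 for b in ys)
    return cov / (vx * vy) ** 0.5

random.seed(42)
x = [random.gauss(0, 1) for _ in range(1000)]
y = [2.0 * xi + random.gauss(0, 0.1) for xi in x]  # planted correlation

r = pearson(x, y)
assert r > 0.99, "planted correlation not recovered"
```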

PHI and HIPAA Architecture

You do NOT need to manually restructure or de-identify protected health information (PHI) or personally identifiable information (PII) before upload. AURA handles this implicitly through its Zero-Trust logic structure. Data never touches a persistent SQL disk: payloads are mapped strictly into volatile edge RAM, calculated statelessly, and cryptographically destroyed the moment the connection closes.

View formal HIPAA Policy Overview →

Intellectual Property & Privacy

We do not retain, train upon, or harvest our customers' data inputs. Period. When a session terminates, the node's vector arrays are destroyed. All correlation matrices, operational discoveries, and exported data belong solely and exclusively to your organization.

Frequently Asked Questions

What happens to my data if my browser window crashes?
Because we strictly enforce a Zero-Trust environment to protect your data, the edge node monitoring the websocket connection detects the dead socket and instantly flushes the volatile memory. The workspace instance is completely erased. You will need to start your query from scratch. Customer support cannot recover this data for you.
Why did my API throw a "402 Insufficient Tokens" error?
The $199/month Professional tier grants up to 2,500 intense queries per month. Heavy mathematical interpolations consume larger node cycles. If you exhaust your logic bandwidth, you must wait for the monthly cycle to refresh or upgrade immediately to Enterprise bare-metal infrastructure. Account support will not manually override exhaustion limits.
Can I export the logic tree after returning to a closed session?
No. Because AURA enforces strict HIPAA and IP stateless security, a workspace cannot be "reopened" later unless you specifically issue an API command to save a copy locally while it is active. Once you navigate away from the dashboard, the intelligence is permanently scrubbed without exception.
Developer Integration

API Documentation

Required Consultation

Before granting programmatic access to our hardware layer, all prospective enterprise API integrators must complete a mandatory architecture consultation. This ensures your data constraints map perfectly to our deterministic solver geometry and prevents unnecessary computational overhead.

Pricing & Management

Consultation and API pricing is managed at the user level. The consultation is booked through your primary portal or by contacting quantum@dynsell.com, and API keys are issued directly through your authenticated workspace hub dashboard.

  • $2,100 – Consultation and Development
  • $200 – Initial Provisioning Fee (Unlimited Use)
  • $899 – Monthly Maintenance Fee (Not required if self-maintained, though support delays from our engineers may be possible)
  • Scaled Provisioning Fee: Based directly on computational utilization, billed incrementally at every $50 interval.
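As a worked illustration of the $50-interval billing above, the helper below rounds a utilization figure up to the next increment. The metering formula itself is not published; only the rounding behavior is taken from the bullet above, and the figures are placeholders.

```python
import math

def scaled_provisioning_fee(utilization_dollars: float, increment: float = 50.0) -> float:
    """Round computational utilization up to the next $50 billing increment.

    The exact metering formula is not published; this sketch only
    illustrates "billed incrementally at every $50 interval".
    """
    if utilization_dollars <= 0:
        return 0.0
    return math.ceil(utilization_dollars / increment) * increment

print(scaled_provisioning_fee(120.0))  # 150.0: billed at the next $50 step
```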

The MPS Transporter Model

The OPENmps API does NOT store non-MPS data. We use Matrix Product States (MPS) exclusively as a highly efficient transporter between your input and output sources. Dynsell acts purely as a stateless deterministic function.

Functional Example:

A field maintenance technician records a system status payload and presses "Submit." Instead of pushing raw unstructured logs into a traditional database, the submission payload routes through OPENmps. It is immediately rendered into an MPS array, transported to the final destination securely as an MPS, and perfectly reconstructed on the output—exactly as specified in our white paper. We utilize MPS truncation to compress any unstructured text dynamically. This rigorous topological transformation is precisely why initial consultation is required to set mathematical boundaries for execution.
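The round-trip property described above can be illustrated with a stand-in codec. Here zlib substitutes for MPS truncation purely to show the stateless encode, transport, decode flow; nothing below is the actual MPS math.

```python
import zlib

# Stand-in sketch: the real transporter encodes payloads as Matrix Product
# States; zlib stands in for MPS truncation purely to illustrate the
# stateless round-trip property (input reconstructed exactly, nothing stored).
def to_transport_form(payload: bytes) -> bytes:
    return zlib.compress(payload)

def from_transport_form(blob: bytes) -> bytes:
    return zlib.decompress(blob)

log = b"PUMP-7 pressure nominal; valve B cycled at 04:12"
assert from_transport_form(to_transport_form(log)) == log  # lossless round trip
```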

OPENmps API

The OPENmps API is our custom, company-specific data analysis and management interface. It exposes every feature of Dynsell through clean, sensible RESTful integrations. Above all, it completely protects the user's IP. We do not pipe your data into OpenAI, ChatGPT, or public clouds. The process strictly entails:

  • Data is input into the API.
  • The API pushes the data through the exact Matrix Product State process outlined in the technical white paper.
  • Data is output securely into your environment exactly as you requested it. Consider it a secure transportation and data-optimization protocol, as universally accessible as a data compressor.

Auth & Tokens

All API requests demand an active Bearer Authorization header generated securely via your account settings. Tokens can be scoped to read, execute, or root levels for modular deployment.
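A minimal sketch of assembling such an authenticated request in Python, using the endpoint from the Synchronous Interrogation example. The token value and workspace ID are placeholders, not real credentials, and the request is deliberately not dispatched.

```python
import json
import urllib.request

# Hypothetical sketch: how a Bearer-authenticated request could be assembled.
AURA_TOKEN = "aura_sk_example"   # generated in account settings (placeholder)
WORKSPACE_ID = "ws_0000"         # placeholder workspace ID

req = urllib.request.Request(
    url=f"https://api.dynsell.com/api/registries/{WORKSPACE_ID}/interrogate",
    data=json.dumps({"query": "What is the anomaly variance?"}).encode(),
    headers={
        "Authorization": f"Bearer {AURA_TOKEN}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would dispatch it; omitted here deliberately.
print(req.get_header("Authorization"))
```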

API Overview

The base endpoint is api.dynsell.com/api. Our services prioritize extremely low-latency asynchronous execution for massive datasets, using webhook callbacks to deliver the final state upon convergence.
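A webhook consumer might look like the sketch below. The payload field names (`status`, `result`) are assumptions for illustration; the real callback schema is not documented here, so treat them as hypothetical.

```python
import json

# Hypothetical callback handler: submit a job, then receive the final
# converged state at your callback URL instead of polling. Field names
# ("status", "result") are illustrative assumptions.
def handle_aura_callback(raw_body: bytes) -> dict:
    """Parse a convergence callback and reject jobs that are not finished."""
    event = json.loads(raw_body)
    if event.get("status") != "converged":
        raise ValueError(f"job not finished: {event.get('status')!r}")
    return event["result"]

body = json.dumps({"status": "converged", "result": {"ground_state_energy": -12.4}}).encode()
print(handle_aura_callback(body))
```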

Example: Synchronous Interrogation

# POST /api/registries/{id}/interrogate
curl -X POST https://api.dynsell.com/api/registries/$WORKSPACE_ID/interrogate \
  -H "Authorization: Bearer $AURA_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"query":"What is the anomaly variance?"}'

Math-Backed Promises

Rather than promising generalized "AI insights," our API guarantees a strict adherence to Hamiltonian math. We return structural topology mappings, Ground State Energy calculations, and deterministically sourced relationships.

Data Privacy

A Zero-Trust Guarantee

Data pushed through our API traverses closed networks. We do not use third-party LLM providers for data analysis, guaranteeing that your raw numbers never leak into global transformer training sets.

Data Types

OPENmps natively digests standard CSV files and standardized JSON objects. Streaming support for massive payloads is available via multi-part upload pathways.

Flexibility of Integration

Designed to slip seamlessly behind your proprietary firewalls. Whether integrating AURA's analytical engine into your internal company dashboard, piping its output into a PostgreSQL reporting database, or utilizing it as a real-time data sanitizer, the API adjusts perfectly to existing DevSecOps constraints.