Prerequisites
NeuroSim Core requires two external services before it will start: Apache Kafka and PostgreSQL. This page covers how to stand each one up in a configuration compatible with Core.
Apache Kafka
Kafka is NeuroSim's message bus. All plugin registration heartbeats, scenario orchestration messages, and scenario event traffic flow through Kafka topics. Kafka Tiered Storage is used to maintain complete scenario message history while migrating that data to secondary storage over time.
NeuroSim Core will refuse to start if it cannot reach at least one Kafka broker.
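The startup check is a simple TCP reachability test against a broker endpoint. A minimal sketch of an equivalent check, using only the Python standard library (the host and port values are examples, not a NeuroSim API):

```python
import socket

def broker_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to the broker endpoint succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. against the Docker Compose listener described below:
# broker_reachable("localhost", 9092)
```

A check like this only proves the port accepts connections; Core additionally requires the broker to respond to Kafka protocol requests.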
Version Requirement
Apache Kafka version 3.5 or higher is recommended. The open-source distribution of Kafka is fully supported; sites may instead use Confluent Platform or Confluent Cloud for commercial support or additional features. NeuroSim Core works with any of these.
Quick Start with Docker
The fastest way to run Kafka for evaluation or development is to start local containers with Docker Compose. This uses the same images as the NeuroSim Core development environment:
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.5.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000
    ports:
      - "2181:2181"
  kafka:
    image: confluentinc/cp-kafka:7.5.0
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:29092,PLAINTEXT_HOST://localhost:9092
      KAFKA_LISTENERS: PLAINTEXT://0.0.0.0:29092,PLAINTEXT_HOST://0.0.0.0:9092
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_TRANSACTION_STATE_LOG_MIN_ISR: 1
      KAFKA_TRANSACTION_STATE_LOG_REPLICATION_FACTOR: 1
Note: The single-broker configuration above sets REPLICATION_FACTOR: 1, which is appropriate for development only. See Production Considerations below.
Topic Configuration
NeuroSim Core automatically creates any Kafka topics it needs if they don't exist. The exception is the plugin registration topic, which should be created with log compaction so that a restarting Core can recover the current registration state without replaying the full history.
Create the registration topic with log compaction before starting Core:
kafka-topics.sh --bootstrap-server localhost:9092 \
--create \
--topic neurosim-plugins \
--partitions 1 \
--replication-factor 1 \
--config cleanup.policy=compact \
--config min.cleanable.dirty.ratio=0.01 \
--config segment.ms=10000
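Log compaction retains the most recent record for each key, so a restarting consumer can rebuild current state from the compacted topic rather than replaying every update. Conceptually (a simplified Python illustration of the semantics, not actual NeuroSim code; the plugin names are made up):

```python
def compact(records):
    """Simulate Kafka log compaction over (key, value) pairs:
    keep only the latest value per key; a None value (a "tombstone")
    deletes the key entirely."""
    state = {}
    for key, value in records:
        if value is None:
            state.pop(key, None)  # tombstone: key removed from the log
        else:
            state[key] = value    # later records supersede earlier ones
    return state

# A restarting Core sees only the latest state per plugin:
log = [
    ("plugin-a", "registered"),
    ("plugin-b", "registered"),
    ("plugin-a", "heartbeat-ok"),
    ("plugin-b", None),  # plugin-b deregistered
]
# compact(log) → {"plugin-a": "heartbeat-ok"}
```

The aggressive min.cleanable.dirty.ratio and segment.ms settings in the command above make the compactor run frequently, keeping the topic close to this minimal form.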
Any other topics used by NeuroSim require no special configuration and will be created automatically with broker defaults if they don't exist. This requires topic auto-creation to be enabled on the brokers (auto.create.topics.enable=true, which is the Kafka default).
Production Considerations
Production deployments should use multi-broker Kafka clusters to ensure high availability (HA) and scalability as NeuroSim scenario runs increase in size and message load.
NeuroSim Core maintains scenario run message history in the same Kafka topic that acted as the data plane for the run. To keep Kafka performance at optimal levels, the messages in these topics should be configured to migrate to secondary storage using Kafka's Tiered Storage facility.
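As a sketch of what that involves (Kafka 3.6 or later; the RemoteStorageManager plugin settings for your particular backend are omitted, the topic name is a placeholder, and the retention value is only an example): tiered storage is enabled once at the broker level, then per topic.

```shell
# Broker side (server.properties): enable the tiered storage subsystem,
# plus the remote-storage plugin configuration for your backend.
#   remote.log.storage.system.enable=true

# Topic side: offload a scenario run topic to remote storage, keeping
# roughly one day of data on local broker disks.
kafka-configs.sh --bootstrap-server localhost:9092 --alter \
  --entity-type topics --entity-name <run-topic> \
  --add-config remote.storage.enable=true,local.retention.ms=86400000
```

Consult the Kafka tiered storage documentation for your version before enabling this in production; the feature's maturity varies across releases.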
PostgreSQL
PostgreSQL is used to persist scenario definitions and run history attributes. Without it, this data exists only in memory and is lost when Core restarts.
If you only need Core for transient simulations and do not need historical run data, you can run with store_type: memory and skip PostgreSQL entirely.
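For example, the persistence mode might be selected in the Core configuration along these lines (store_type: memory comes from this page; the alternative key names and the DSN shown for contrast are illustrative assumptions, not the authoritative configuration schema):

```yaml
# Transient in-memory store: no PostgreSQL needed; data is lost on restart.
store_type: memory

# Durable alternative (hypothetical keys; DSN uses the standard libpq URI form):
# store_type: postgres
# database_dsn: postgresql://neurosim:changeme@localhost:5432/neurosim
```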
Version Requirement
NeuroSim Core uses PostgreSQL JSONB columns and timestamptz, so PostgreSQL 12 or later is required. NeuroSim uses PostgreSQL version 16 in development and testing.
Creating the Database and User
Connect to PostgreSQL as a superuser and create a dedicated database and user:
CREATE USER neurosim WITH PASSWORD 'changeme';
CREATE DATABASE neurosim OWNER neurosim;
GRANT ALL PRIVILEGES ON DATABASE neurosim TO neurosim;
Replace changeme with a strong password. Store the credentials securely; you will need them for the Core configuration file.
Schema Migration
NeuroSim Core automatically applies database migrations on startup. No manual schema setup is required — simply point Core at the database with a valid DSN and it will create the required tables on first run.
The schema creates two tables:
scenario_definitions — stores saved scenario configurations
scenario_runs — stores per-run state, timing, and event statistics
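To illustrate the kinds of columns involved (the actual migrated schema may differ; every column name here is an assumption based on the descriptions above), the two tables could resemble:

```sql
CREATE TABLE scenario_definitions (
    id         bigserial PRIMARY KEY,
    name       text NOT NULL,
    definition jsonb NOT NULL,                      -- saved scenario configuration
    created_at timestamptz NOT NULL DEFAULT now()
);

CREATE TABLE scenario_runs (
    id          bigserial PRIMARY KEY,
    scenario_id bigint REFERENCES scenario_definitions (id),
    state       text NOT NULL,                      -- per-run state
    started_at  timestamptz,
    finished_at timestamptz,
    stats       jsonb                               -- event statistics
);
```

There is no need to create these yourself; the migration on first startup is authoritative.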
Quick Start with Docker
For evaluation or development:
docker run -d \
--name neurosim-postgres \
-e POSTGRES_USER=neurosim \
-e POSTGRES_PASSWORD=changeme \
-e POSTGRES_DB=neurosim \
-p 5432:5432 \
postgres:16
Production Considerations
Standard database administration practices, including regular backups, access control, and monitoring, are essential for protecting this data against loss or compromise. Detailed guidance is beyond the scope of this guide.