refactor: Revise job schemas #43

Merged · 41 commits · Nov 13, 2024
Commits (41)
ef6f770
task -> job
forest1040 Oct 9, 2024
18180b0
jobs
forest1040 Oct 9, 2024
cf7ed57
integration jobs.yaml and results.yaml
forest1040 Oct 10, 2024
7ffe9a9
integration jobs.yaml and results.yaml
forest1040 Oct 10, 2024
d15bba7
change snake_case yaml properties and parameters
forest1040 Oct 10, 2024
d94cb23
schema change
forest1040 Oct 10, 2024
377ee8b
db schema change
forest1040 Oct 10, 2024
5bfc2eb
init device data
forest1040 Oct 10, 2024
bf92733
add init data jobs
forest1040 Oct 10, 2024
415bc0b
modify init.sql
forest1040 Oct 10, 2024
9b1f150
typo note
forest1040 Oct 10, 2024
b5767a1
remove calibration_data
forest1040 Oct 10, 2024
3475499
change device api routes
forest1040 Oct 10, 2024
c9012af
change pending_tasks to pending_jobs
forest1040 Oct 10, 2024
ee63818
change device list api
forest1040 Oct 10, 2024
4acde84
restart_at -> available_at
forest1040 Oct 14, 2024
0fe8b02
deviceId -> device_id
forest1040 Oct 14, 2024
9d4e74a
refactor task,result to job
forest1040 Oct 14, 2024
afd739b
jobs router
forest1040 Oct 14, 2024
79c2fa1
get jobs
forest1040 Oct 14, 2024
1fdfc6a
job status
forest1040 Oct 14, 2024
d6c249a
job cancel
forest1040 Oct 14, 2024
f8e2ca3
end of summer camp
forest1040 Oct 14, 2024
5ffaeb4
job provider
forest1040 Oct 14, 2024
2e7802f
end of summer camp
forest1040 Oct 14, 2024
9f3f26e
Merge branch 'jobs-json' of github.com:oqtopus-team/oqtopus-cloud int…
katsujukou Oct 17, 2024
8084f22
Modify userAPIs' router
katsujukou Oct 24, 2024
f693f1e
Make jobs.job_info as discriminated union.
katsujukou Oct 24, 2024
374bcd8
datamodel code generator custom template
katsujukou Oct 24, 2024
28b7527
[backend][OAS] Use literal type for singleton enum.
katsujukou Oct 25, 2024
e759ac1
Update db/init/01.schema.sql
katsujukou Oct 25, 2024
7b4f570
Merge branch 'develop' into jobs-json
katsujukou Oct 28, 2024
dc3be4b
Refine user api job schema definition
katsujukou Nov 6, 2024
aa90319
rename oas JobFullDef to JobDef
katsujukou Nov 7, 2024
23bfa14
[provider] modify JobDef schema
katsujukou Nov 8, 2024
502fd11
[provider] Job schema
katsujukou Nov 8, 2024
19efb6e
[backend] rename task -> job
katsujukou Nov 11, 2024
3b10dc6
Add Tests
katsujukou Nov 11, 2024
c13ac26
Add tests for provider
katsujukou Nov 11, 2024
249ee06
[backend][OAS] modify resource properties naming according to style g…
katsujukou Nov 12, 2024
aacff1e
[backend][OAS] modify resource properties naming according to style g…
katsujukou Nov 12, 2024
8 changes: 6 additions & 2 deletions backend/Makefile
@@ -27,8 +27,12 @@ generate:
--use-field-description \
--input ${INPUT} \
--input-file-type openapi \
--collapse-root-models \
--enum-field-as-literal one \
--use-union-operator \
--use-subclass-enum \
--custom-file-header-path "oqtopus_cloud/datamodel_codegen/datamodel-codegen-template.j2" \
--output-model-type pydantic_v2.BaseModel \
--enum-field-as-literal all \
--disable-timestamp \
--use-standard-collections \
--strict-nullable \
@@ -80,8 +84,8 @@ lint-common: ## Run common linters
@poetry run mypy -p oqtopus_cloud.common

lint-user: ## Run user linters
@poetry run ruff check oqtopus_cloud/user
@poetry run mypy -p oqtopus_cloud.user
@poetry run ruff check oqtopus_cloud/user

lint-provider: ## Run provider linters
@poetry run ruff check oqtopus_cloud/provider
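For context on what the revised datamodel-code-generator flags imply: below is a hedged, illustrative sketch of the kind of Pydantic v2 output such options can produce for a discriminated `job_info` union. The class and field names are assumptions for illustration, not this PR's generated code.

```python
# Hypothetical illustration, not this PR's generated output: with
# --output-model-type pydantic_v2.BaseModel, --enum-field-as-literal all and
# --use-union-operator, a discriminated job_info union could be emitted
# roughly like this.
from typing import Annotated, Literal
from pydantic import BaseModel, Field

class SamplingJobInfo(BaseModel):
    job_type: Literal["sampling"]      # singleton enum rendered as a Literal
    shots: int = 1000

class EstimationJobInfo(BaseModel):
    job_type: Literal["estimation"]
    operator: str | None = None        # union operator instead of Optional[...]

class JobDef(BaseModel):
    # Pydantic v2 discriminated union keyed on the job_type literal
    job_info: Annotated[
        SamplingJobInfo | EstimationJobInfo, Field(discriminator="job_type")
    ]
```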
91 changes: 25 additions & 66 deletions backend/db/init/01.schema.sql
@@ -1,75 +1,34 @@
drop table if exists main.devices;
CREATE TABLE IF NOT EXISTS main.devices (
id VARCHAR(64) PRIMARY KEY,
device_type ENUM ('QPU', 'simulator') NOT NULL,
status ENUM ('AVAILABLE', 'NOT_AVAILABLE') DEFAULT 'NOT_AVAILABLE' NOT NULL,
restart_at DATETIME,
pending_tasks INT DEFAULT 0 NOT NULL,
n_qubits INT NOT NULL,
n_nodes INT,
device_type VARCHAR(32) DEFAULT 'QPU' NOT NULL,
status VARCHAR(64) DEFAULT 'available' NOT NULL,
available_at DATETIME,
pending_jobs INT DEFAULT 0 NOT NULL,
n_qubits INT DEFAULT 1 NOT NULL,
basis_gates VARCHAR(256) NOT NULL,
instructions VARCHAR(64) NOT NULL,
calibration_data TEXT,
device_info TEXT,
calibrated_at DATETIME,
description VARCHAR(128) NOT NULL
description VARCHAR(128) NOT NULL,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP
);

-- INSERT INTO main.devices (id, device_type, n_qubits, n_nodes, basis_gates, instructions, description)
-- SELECT 'SC', 'QPU', 64, NULL, '["sx", "rz", "rzx90", "id"]', '["measure", "barrier"]', 'Superconducting quantum computer'
-- WHERE NOT EXISTS (SELECT * FROM main.devices WHERE id = 'SC');

-- INSERT INTO main.devices (id, device_type, n_qubits, n_nodes, basis_gates, instructions, description)
-- SELECT 'SVSim', 'simulator', 39, 512, '["x", "y", "z", "h", "s", "sdg", "t", "tdg", "rx", "ry", "rz", "cx", "cz", "swap", "u1", "u2", "u3", "u", "p", "id", "sx", "sxdg"]', '["measure", "barrier", "reset"]', 'State vector-based quantum circuit simulator'
-- WHERE NOT EXISTS (SELECT * FROM main.devices WHERE id = 'SVSim');


CREATE TABLE IF NOT EXISTS main.tasks (
id VARBINARY(16) PRIMARY KEY,
drop table if exists main.jobs;
CREATE TABLE IF NOT EXISTS main.jobs (
id VARCHAR(64) PRIMARY KEY,
owner VARCHAR(64) NOT NULL,
name varchar(256),
device VARCHAR(64) NOT NULL,
n_qubits INT,
n_nodes INT,
code TEXT NOT NULL,
action ENUM ('sampling', 'estimation') NOT NULL,
method ENUM ('state_vector', 'sampling'),
shots INT,
operator VARCHAR(1024),
qubit_allocation TEXT,
skip_transpilation BOOLEAN DEFAULT false NOT NULL,
seed_transpilation INT,
seed_simulation INT,
n_per_node INT UNSIGNED,
simulation_opt text,
ro_error_mitigation enum('none', 'pseudo_inverse', 'least_square'),
note VARCHAR(1024),
status ENUM ('QUEUED', 'RUNNING', 'COMPLETED', 'FAILED', 'CANCELLING', 'CANCELLED') NOT NULL DEFAULT 'QUEUED',
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
FOREIGN KEY (device) REFERENCES devices(id)
name varchar(256) DEFAULT '' NOT NULL,
description VARCHAR(1024),
device_id VARCHAR(64) NOT NULL,
job_info TEXT,
transpiler_info TEXT,
simulator_info TEXT,
mitigation_info TEXT,
job_type VARCHAR(32) DEFAULT 'sampling' NOT NULL,
shots INT DEFAULT 1000 NOT NULL,
status VARCHAR(32) DEFAULT 'submitted' NOT NULL,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP
);

CREATE TABLE IF NOT EXISTS main.results (
task_id VARBINARY(16) PRIMARY KEY,
status ENUM('SUCCESS', 'FAILURE', 'CANCELLED') NOT NULL,
result TEXT,
reason TEXT,
transpiled_code TEXT,
qubit_allocation TEXT,
FOREIGN KEY (task_id) REFERENCES tasks(id) ON DELETE CASCADE
);

delimiter $$

CREATE TRIGGER main.update_tasks_status_trigger
AFTER INSERT ON main.results
FOR EACH ROW
BEGIN
IF NEW.status = 'SUCCESS' THEN
UPDATE main.tasks SET status = 'COMPLETED' WHERE id = NEW.task_id;
ELSEIF NEW.status = 'FAILURE' THEN
UPDATE main.tasks SET status = 'FAILED' WHERE id = NEW.task_id;
ELSEIF NEW.status = 'CANCELLED' THEN
UPDATE main.tasks SET status = 'CANCELLED' WHERE id = NEW.task_id;
END IF;
END$$

delimiter ;
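For readers mapping the revised `main.jobs` table to application code, here is a minimal SQLAlchemy 2.0 sketch that mirrors the column names and defaults in `01.schema.sql` above. The ORM models this PR actually touches are not part of this excerpt, so treat this purely as an assumption.

```python
# Minimal SQLAlchemy 2.0 sketch of the revised main.jobs table (assumption;
# transpiler_info, simulator_info, and mitigation_info columns omitted for brevity).
from datetime import datetime
from sqlalchemy import DateTime, Integer, String, Text, func
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

class Base(DeclarativeBase):
    pass

class Job(Base):
    __tablename__ = "jobs"

    id: Mapped[str] = mapped_column(String(64), primary_key=True)
    owner: Mapped[str] = mapped_column(String(64))
    name: Mapped[str] = mapped_column(String(256), default="")
    description: Mapped[str | None] = mapped_column(String(1024))
    device_id: Mapped[str] = mapped_column(String(64))
    job_info: Mapped[str | None] = mapped_column(Text)  # JSON payload stored as TEXT
    job_type: Mapped[str] = mapped_column(String(32), default="sampling")
    shots: Mapped[int] = mapped_column(Integer, default=1000)
    status: Mapped[str] = mapped_column(String(32), default="submitted")
    created_at: Mapped[datetime] = mapped_column(DateTime, server_default=func.now())
    updated_at: Mapped[datetime] = mapped_column(
        DateTime, server_default=func.now(), onupdate=func.now()
    )
```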
18 changes: 9 additions & 9 deletions backend/db/init/02.insert_device.sql
@@ -1,15 +1,15 @@
INSERT INTO main.devices (id, device_type,status, n_qubits, n_nodes, basis_gates, instructions, description)
SELECT 'SC', 'QPU','AVAILABLE', 64, NULL, '["sx", "rz", "rzx90", "id"]', '["measure", "barrier"]', 'Superconducting quantum computer'
INSERT INTO main.devices (id, device_type, status, available_at, pending_jobs, n_qubits, basis_gates, instructions, device_info, calibrated_at, description)
SELECT 'SC', 'QPU','available', CURRENT_TIMESTAMP, 9, 64, '["sx", "rz", "rzx90", "id"]', '["measure", "barrier"]', '', CURRENT_TIMESTAMP, 'Superconducting quantum computer'
WHERE NOT EXISTS (SELECT * FROM main.devices WHERE id = 'SC');

INSERT INTO main.devices (id, device_type, status, n_qubits, n_nodes, basis_gates, instructions, description)
SELECT 'SVSim', 'simulator','AVAILABLE', 39, 512, '["x", "y", "z", "h", "s", "sdg", "t", "tdg", "rx", "ry", "rz", "cx", "cz", "swap", "u1", "u2", "u3", "u", "p", "id", "sx", "sxdg"]', '["measure", "barrier", "reset"]', 'State vector-based quantum circuit simulator'
INSERT INTO main.devices (id, device_type, status, available_at, pending_jobs, n_qubits, basis_gates, instructions, device_info, calibrated_at, description)
SELECT 'SVSim', 'simulator','available', CURRENT_TIMESTAMP, 0, 39, '["x", "y", "z", "h", "s", "sdg", "t", "tdg", "rx", "ry", "rz", "cx", "cz", "swap", "u1", "u2", "u3", "u", "p", "id", "sx", "sxdg"]', '["measure", "barrier", "reset"]', '', CURRENT_TIMESTAMP, 'State vector-based quantum circuit simulator'
WHERE NOT EXISTS (SELECT * FROM main.devices WHERE id = 'SVSim');

INSERT INTO main.devices (id, device_type, status, n_qubits, n_nodes, basis_gates, instructions, description)
SELECT 'Kawasaki', 'QPU','AVAILABLE', 64, NULL, '["sx", "rz", "rzx90", "id"]', '["measure", "barrier"]', 'Superconducting quantum computer'
INSERT INTO main.devices (id, device_type, status, available_at, pending_jobs, n_qubits, basis_gates, instructions, device_info, calibrated_at, description)
SELECT 'Kawasaki', 'QPU','available', CURRENT_TIMESTAMP, 2, 64, '["sx", "rz", "rzx90", "id"]', '["measure", "barrier"]', '', CURRENT_TIMESTAMP, 'Superconducting quantum computer'
WHERE NOT EXISTS (SELECT * FROM main.devices WHERE id = 'Kawasaki');

INSERT INTO main.devices (id, device_type, n_qubits, n_nodes, basis_gates, instructions, description)
SELECT 'test1', 'QPU', 64, NULL, '["sx", "rz", "rzx90", "id"]', '["measure", "barrier"]', 'Superconducting quantum computer'
WHERE NOT EXISTS (SELECT * FROM main.devices WHERE id = 'test');
INSERT INTO main.devices (id, device_type, status, available_at, pending_jobs, n_qubits, basis_gates, instructions, device_info, calibrated_at, description)
SELECT '01927422-86d4-7597-b724-b08a5e7781fc', 'QPU','unavailable', CURRENT_TIMESTAMP, 0, 64, '["sx", "rz", "rzx90", "id"]', '["measure", "barrier"]', '', CURRENT_TIMESTAMP, 'Superconducting quantum computer'
WHERE NOT EXISTS (SELECT * FROM main.devices WHERE id = '01927422-86d4-7597-b724-b08a5e7781fc');
8 changes: 8 additions & 0 deletions backend/db/init/03.insert_job.sql
@@ -0,0 +1,8 @@
INSERT INTO main.jobs (id, owner, name, description, device_id, job_info, transpiler_info, simulator_info, mitigation_info, job_type, shots, status)
SELECT '01927422-86d4-73d6-abb4-f2de6a4f5910', 'admin', 'Test job 1', 'Test job 1 description', 'Kawasaki', '{\'code\': \'todo\'}', '', '', '', 'sampling', 1000, 'submitted'
WHERE NOT EXISTS (SELECT * FROM main.jobs WHERE owner = 'admin' AND name = 'Test job 1');

INSERT INTO main.jobs (id, owner, name, description, device_id, job_info, transpiler_info, simulator_info, mitigation_info, job_type, shots, status)
SELECT '01927422-86d4-7cbf-98d3-32f5f1263cd9', 'admin', 'Test job 2', 'Test job 2 description', 'Kawasaki', '{\'code\': \'todo\'}', '', '', '', 'sampling', 1000, 'submitted'
WHERE NOT EXISTS (SELECT * FROM main.jobs WHERE owner = 'admin' AND name = 'Test job 2');

7 changes: 0 additions & 7 deletions backend/db/init/03.insert_task.sql

This file was deleted.

26 changes: 13 additions & 13 deletions backend/oas/README.md
@@ -6,7 +6,7 @@

- As a rule, file names follow the `{tag}.{FooBar}.yaml` format, with `FooBar` in CamelCase
- The `tag` in the file name is the API resource name
- The `FooBar` in the file name is the method name of the corresponding API (e.g. `task/{task_id}` → `task.TaskId.yaml`)
- The `FooBar` in the file name is the method name of the corresponding API (e.g. `job/{job_id}` → `job.JobId.yaml`)
- Since `datamodel-code-generator` reads files in the `{tag}.{FooBar}.yaml` format, `tag` is the API resource name and `FooBar` is the class name used in the auto-generated code

#### Example 1.1
@@ -16,21 +16,21 @@ tree
.
├── openapi.yaml
├── parameters
│ └── task.TaskId.yaml
│ └── job.JobId.yaml
├── paths
│ ├── hello.yaml
│ ├── task.TaskId.yaml
│ └── task.yaml
│ ├── job.JobId.yaml
│ └── job.yaml
├── root.yaml
└── schemas
├── error.BadRequest.yaml
├── error.InternalServerError.yaml
├── error.NotFoundError.yaml
├── hello.HelloWorldResponse.yaml
├── task.GetTaskResponse.yaml
├── task.PostTaskRequest.yaml
├── task.PostTaskResponse.yaml
└── task.TaskId.yaml
├── job.GetJobResponse.yaml
├── job.PostJobRequest.yaml
├── job.PostJobResponse.yaml
└── job.JobId.yaml
```

### 1.2 Schema naming conventions
Expand All @@ -41,12 +41,12 @@ tree
#### Example 1.2

```yaml
# schemas/task.GetTaskResponse.yaml
description: detail of task response
# schemas/job.GetJobResponse.yaml
description: detail of job response
type: object
properties:
task_id:
$ref: "task.TaskId.yaml"
job_id:
$ref: "job.JobId.yaml"
code: {type: string, example: "OPENQASM 3; qubit[2] q; bit[2] c; h q[0]; cnot q[0], q[1]; c = measure q;"}
device: {type: string, example: "Kawasaki"}
n_qubits:
@@ -94,7 +94,7 @@ properties:
optimization_swap_level: 1
}
required: [
taskId, code, device, skip_transpilation
jobId, code, device, skip_transpilation
]

```
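For illustration, a Pydantic v2 model one might expect datamodel-code-generator to emit for a schema shaped like `job.GetJobResponse.yaml` could look roughly as follows; the field set is inferred from the excerpt above and is not the exact generated code.

```python
# Hypothetical sketch, not the PR's generated code: a Pydantic v2 model
# approximating schemas/job.GetJobResponse.yaml as excerpted above.
from pydantic import BaseModel, Field

class GetJobResponse(BaseModel):
    job_id: str = Field(..., description="Job identifier")
    code: str = Field(
        ...,
        examples=[
            "OPENQASM 3; qubit[2] q; bit[2] c; h q[0]; cnot q[0], q[1]; c = measure q;"
        ],
    )
    device: str = Field(..., examples=["Kawasaki"])
    n_qubits: int | None = None
    skip_transpilation: bool = False
```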