dragonpilot - an open-source driver assistance system based on openpilot

#!/usr/bin/env python3
import os
from datetime import datetime, timedelta
from functools import lru_cache
from pathlib import Path
from typing import IO, Union

DATA_CI_ACCOUNT = "commadataci"
DATA_CI_ACCOUNT_URL = f"https://{DATA_CI_ACCOUNT}.blob.core.windows.net"
OPENPILOT_CI_CONTAINER = "openpilotci"
DATA_CI_CONTAINER = "commadataci"
BASE_URL = f"{DATA_CI_ACCOUNT_URL}/{OPENPILOT_CI_CONTAINER}/"

TOKEN_PATH = Path("/data/azure_token")

def get_url(route_name: str, segment_num, log_type="rlog") -> str:
ext = "hevc" if log_type.endswith('camera') else "bz2"
return BASE_URL + f"{route_name.replace('|', '/')}/{segment_num}/{log_type}.{ext}"
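
# Example of the URL shape get_url produces (hypothetical route name):
#   get_url("FAKE_DONGLE|2022-01-01--00-00-00", 0)
#   -> "https://commadataci.blob.core.windows.net/openpilotci/FAKE_DONGLE/2022-01-01--00-00-00/0/rlog.bz2"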

@lru_cache
def get_azure_credential():
  # credential lookup order: explicit token in the environment, then the
  # on-device token file, then the Azure CLI's cached login
  if "AZURE_TOKEN" in os.environ:
    return os.environ["AZURE_TOKEN"]
  elif TOKEN_PATH.is_file():
    return TOKEN_PATH.read_text().strip()
  else:
    from azure.identity import AzureCliCredential
    return AzureCliCredential()
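
# Hypothetical usage: CI can supply the first option above through the
# environment, which get_azure_credential() then picks up:
#   os.environ["AZURE_TOKEN"] = "<sas-or-bearer-token>"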

@lru_cache
def get_container_sas(account_name: str, container_name: str):
  from azure.storage.blob import BlobServiceClient, ContainerSasPermissions, generate_container_sas

  # issue a short-lived (1 hour) SAS for the container, signed with a user delegation key
  start_time = datetime.utcnow()
  expiry_time = start_time + timedelta(hours=1)
  blob_service = BlobServiceClient(
    account_url=f"https://{account_name}.blob.core.windows.net",
    credential=get_azure_credential(),
  )
  return generate_container_sas(
    account_name,
    container_name,
    user_delegation_key=blob_service.get_user_delegation_key(start_time, expiry_time),
    permission=ContainerSasPermissions(read=True, write=True, list=True),
    expiry=expiry_time,
  )
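
# Usage sketch (assumed, not from the original file): the returned SAS is a query
# string that authorizes direct HTTP access when appended to a blob URL, e.g.:
#   sas = get_container_sas(DATA_CI_ACCOUNT, OPENPILOT_CI_CONTAINER)
#   authed_url = get_url("FAKE_DONGLE|2022-01-01--00-00-00", 0) + "?" + sas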

def upload_bytes(data: Union[bytes, IO], blob_name: str) -> str:
  from azure.storage.blob import BlobClient
  blob = BlobClient(
    account_url=DATA_CI_ACCOUNT_URL,
    container_name=OPENPILOT_CI_CONTAINER,
    blob_name=blob_name,
    credential=get_azure_credential(),
  )
  # overwrite is an upload_blob() argument, not a BlobClient constructor
  # argument; with overwrite=False, existing blobs are left untouched
  blob.upload_blob(data, overwrite=False)
  return BASE_URL + blob_name

def upload_file(path: Union[str, os.PathLike], blob_name: str) -> str:
  with open(path, "rb") as f:
    return upload_bytes(f, blob_name)
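
# Minimal end-to-end sketch (hypothetical local path and blob name), assuming
# AZURE_TOKEN, /data/azure_token, or an `az login` session provides credentials:
#   url = upload_file("/tmp/rlog.bz2", "FAKE_DONGLE/2022-01-01--00-00-00/0/rlog.bz2")
#   print(url)  # -> BASE_URL + blob_name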