enh_inference_streaming.py
Frontend inference
usage: enh_inference_streaming.py [-h] [--config CONFIG]
                                  [--log_level {CRITICAL,ERROR,WARNING,INFO,DEBUG,NOTSET}]
                                  --output_dir OUTPUT_DIR [--ngpu NGPU] [--seed SEED]
                                  [--dtype {float16,float32,float64}] [--fs FS]
                                  [--num_workers NUM_WORKERS]
                                  --data_path_and_name_and_type DATA_PATH_AND_NAME_AND_TYPE
                                  [--key_file KEY_FILE]
                                  [--allow_variable_data_keys ALLOW_VARIABLE_DATA_KEYS]
                                  [--output_format OUTPUT_FORMAT]
                                  [--train_config TRAIN_CONFIG] [--model_file MODEL_FILE]
                                  [--model_tag MODEL_TAG]
                                  [--inference_config INFERENCE_CONFIG]
                                  [--enh_s2t_task ENH_S2T_TASK] [--batch_size BATCH_SIZE]
                                  [--ref_channel REF_CHANNEL]
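A sketch of an invocation using the required flags from the usage string above. The directory and file paths (`exp/enh_train/...`, `dump/raw/test/wav.scp`) are hypothetical placeholders, not paths the tool ships with; only the flag names come from the usage line.

```shell
# Hypothetical example: paths below are placeholders for illustration only.
# --output_dir and --data_path_and_name_and_type are the two required arguments;
# the data triple is "path,name,type" (here: a Kaldi-style wav.scp of mixtures).
python enh_inference_streaming.py \
    --output_dir exp/enhanced \
    --data_path_and_name_and_type "dump/raw/test/wav.scp,speech_mix,sound" \
    --train_config exp/enh_train/config.yaml \
    --model_file exp/enh_train/valid.loss.best.pth \
    --ngpu 1 \
    --batch_size 1
```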