Apollo 6.0
Open-source self-driving car software
model_inference.h
/******************************************************************************
 * Copyright 2020 The Apollo Authors. All Rights Reserved.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 *****************************************************************************/

#pragma once

#include <string>

#include "modules/planning/proto/learning_data.pb.h"
#include "modules/planning/proto/task_config.pb.h"

namespace apollo {
namespace planning {

class ModelInference {
 public:
  /**
   * @brief Constructor
   */
  explicit ModelInference(const LearningModelInferenceTaskConfig& config)
      : config_(config) {}

  /**
   * @brief Destructor
   */
  virtual ~ModelInference() = default;

  /**
   * @brief Get the name of model inference
   */
  virtual std::string GetName() = 0;

  /**
   * @brief Load a learned model
   */
  virtual bool LoadModel() = 0;

  /**
   * @brief Run inference with a learned model
   */
  virtual bool DoInference(LearningDataFrame* learning_data_frame) = 0;

 protected:
  LearningModelInferenceTaskConfig config_;
};

}  // namespace planning
}  // namespace apollo