Agent

The agent allows you to train a model, load it, and use it. It acts as a facade, giving you access to most of Rasa Core’s functionality through a simple API.

Note

Not all functionality is exposed through methods on the agent. Sometimes you need to orchestrate the different components (domain, policy, interpreter, and the tracker store) on your own to customize them.

Here we go:

class rasa_core.agent.Agent(domain: Union[str, rasa_core.domain.Domain] = None, policies: Union[rasa_core.policies.ensemble.PolicyEnsemble, List[rasa_core.policies.policy.Policy], None] = None, interpreter: Optional[rasa_core.interpreter.NaturalLanguageInterpreter] = None, generator: Union[rasa_core.utils.EndpointConfig, NLG, None] = None, tracker_store: Optional[TrackerStore] = None, action_endpoint: Optional[rasa_core.utils.EndpointConfig] = None, fingerprint: Optional[str] = None)[source]

Bases: object

The Agent class provides a convenient interface for the most important Rasa Core functionality.

This includes training, handling messages, loading a dialogue model, getting the next action, and handling a channel.
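
As a quick orientation, the sketch below constructs an agent directly from a domain file and a list of policies. The file path and the policy choice are placeholders for illustration, not defaults of this API:

>>> from rasa_core.agent import Agent
>>> from rasa_core.policies.memoization import MemoizationPolicy
>>> from rasa_core.policies.keras_policy import KerasPolicy
>>> # hypothetical project layout: domain.yml lives next to your training data
>>> agent = Agent("domain.yml",
...               policies=[MemoizationPolicy(), KerasPolicy()])

The agent is now ready to be trained (see load_data() and train() below) or to handle messages once a trained model has been loaded.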

continue_training(trackers: List[rasa_core.trackers.DialogueStateTracker], **kwargs) → None[source]
create_processor(preprocessor: Optional[Callable[str, str]] = None) → rasa_core.processor.MessageProcessor[source]

Instantiates a processor based on the current state of the agent.

static create_tracker_store(store: Optional[TrackerStore], domain: rasa_core.domain.Domain) → TrackerStore[source]
execute_action(sender_id: str, action: str, output_channel: rasa_core.channels.channel.OutputChannel, policy: str, confidence: float) → rasa_core.trackers.DialogueStateTracker[source]

Execute an action for the given conversation and return the updated tracker.

handle_channels(channels: List[rasa_core.channels.channel.InputChannel], http_port: int = 5005, serve_forever: bool = True, route: str = '/webhooks/') → gevent.pywsgi.WSGIServer[source]

Start a webserver, attach the input channels, and handle incoming messages.

If serve_forever is True, this call blocks. Otherwise the webserver is started and the method returns right afterwards.
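
The sketch below attaches a single REST channel to a loaded agent and starts the webserver. The RestInput import path and the model paths are assumptions based on the example project shipped with Rasa Core; check which channels are available in your installed version:

>>> from rasa_core.agent import Agent
>>> from rasa_core.interpreter import RasaNLUInterpreter
>>> from rasa_core.channels import RestInput
>>> interpreter = RasaNLUInterpreter(
...     "examples/restaurantbot/models/nlu/current")
>>> agent = Agent.load("examples/restaurantbot/models/dialogue",
...                    interpreter=interpreter)
>>> # blocks and serves the REST webhook under the /webhooks/ route on port 5005
>>> agent.handle_channels([RestInput()], http_port=5005, serve_forever=True)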

handle_message(message: rasa_core.channels.channel.UserMessage, message_preprocessor: Optional[Callable[str, str]] = None, **kwargs) → Optional[List[str]][source]

Handle a single message.

handle_text(text_message: Union[str, Dict[str, Any]], message_preprocessor: Optional[Callable[str, str]] = None, output_channel: Optional[rasa_core.channels.channel.OutputChannel] = None, sender_id: Optional[str] = 'default') → Optional[List[Dict[str, Any]]][source]

Handle a single message.

If a message preprocessor is passed, the message will first be passed to that function, and the return value is then used as input for the dialogue engine.

The return value of this function depends on the output_channel. If the output channel is not set (i.e. None) or is a CollectingOutputChannel, this function returns the messages the bot wants to respond with.

Example:
>>> from rasa_core.agent import Agent
>>> from rasa_core.interpreter import RasaNLUInterpreter
>>> interpreter = RasaNLUInterpreter(
... "examples/restaurantbot/models/nlu/current")
>>> agent = Agent.load("examples/restaurantbot/models/dialogue",
... interpreter=interpreter)
>>> agent.handle_text("hello")
[u'how can I help you?']
is_ready()[source]

Check if all necessary components are instantiated so the agent can be used.

classmethod load(path: str, interpreter: Optional[rasa_core.interpreter.NaturalLanguageInterpreter] = None, generator: Union[rasa_core.utils.EndpointConfig, NLG] = None, tracker_store: Optional[TrackerStore] = None, action_endpoint: Optional[rasa_core.utils.EndpointConfig] = None) → Agent[source]

Load a persisted model from the passed path.

load_data(resource_name: str, remove_duplicates: bool = True, unique_last_num_states: Optional[int] = None, augmentation_factor: int = 20, tracker_limit: Optional[int] = None, use_story_concatenation: bool = True, debug_plots: bool = False, exclusion_percentage: int = None) → List[rasa_core.trackers.DialogueStateTracker][source]

Load training data from a resource.

log_message(message: rasa_core.channels.channel.UserMessage, message_preprocessor: Optional[Callable[str, str]] = None, **kwargs) → rasa_core.trackers.DialogueStateTracker[source]

Append a message to a dialogue - does not predict actions.

persist(model_path: str, dump_flattened_stories: bool = False) → None[source]

Persists this agent into a directory for later loading and usage.

predict_next(sender_id: str, **kwargs) → Dict[str, Any][source]

Predict the next action for the given conversation.
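
For finer-grained control than handle_text(), you can ask for a prediction and execute an action yourself. A sketch, assuming an already loaded agent; the sender id, action name, and policy/confidence values are illustrative only:

>>> from rasa_core.channels.channel import CollectingOutputChannel
>>> # inspect what the agent would do next for this conversation
>>> prediction = agent.predict_next("user1")
>>> # execute a chosen action; bot messages are collected on the output
>>> # channel instead of being sent to a real user
>>> out = CollectingOutputChannel()
>>> tracker = agent.execute_action("user1", "action_listen", out,
...                                policy="MemoizationPolicy", confidence=1.0)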

toggle_memoization(activate: bool) → None[source]

Toggles memoization on and off.

If a memoization policy is present in the ensemble, this toggles that policy's predictions. When set to False, the memoization policies in the ensemble will not make any predictions, so the ensemble's prediction always has to come from a different policy (e.g. KerasPolicy). This is useful for testing the prediction capabilities of an ensemble while ignoring turns memorized from the training data.
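
For example, to see what the rest of the ensemble predicts when memorized training turns are ignored (the message is a placeholder):

>>> agent.toggle_memoization(False)   # predictions must now come from e.g. KerasPolicy
>>> agent.handle_text("I'd like to book a table")
>>> agent.toggle_memoization(True)    # back to normal operation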

train(training_trackers: List[rasa_core.trackers.DialogueStateTracker], **kwargs) → None[source]

Train the policies / policy ensemble on the provided dialogue trackers.

Args:
    training_trackers: trackers to train on
    **kwargs: additional arguments passed to the underlying ML trainer (e.g. keras parameters)
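
Putting load_data(), train(), and persist() together gives the usual training flow. A minimal sketch continuing the construction example above; the story file and output directory are placeholders:

>>> # generate training trackers from the stories (with default augmentation)
>>> training_trackers = agent.load_data("data/stories.md")
>>> # train the policy ensemble; extra kwargs are forwarded to the trainer
>>> agent.train(training_trackers)
>>> # save the dialogue model so it can later be restored via Agent.load()
>>> agent.persist("models/dialogue")
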
update_model(domain: Union[str, rasa_core.domain.Domain], policy_ensemble: rasa_core.policies.ensemble.PolicyEnsemble, fingerprint: Optional[str], interpreter: Optional[rasa_core.interpreter.NaturalLanguageInterpreter] = None) → None[source]
visualize(resource_name: str, output_file: str, max_history: Optional[int] = None, nlu_training_data: Optional[str] = None, should_merge_nodes: bool = True, fontsize: int = 12) → None[source]
rasa_core.agent.load_from_server(interpreter: Optional[rasa_core.interpreter.NaturalLanguageInterpreter] = None, generator: Union[rasa_core.utils.EndpointConfig, NLG, None] = None, tracker_store: Optional[TrackerStore] = None, action_endpoint: Optional[rasa_core.utils.EndpointConfig] = None, model_server: Optional[rasa_core.utils.EndpointConfig] = None) → Agent[source]

Load a persisted model from a server.
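
A sketch of loading a model from a remote model server; the URL is a placeholder, and the endpoint is assumed to serve a dialogue model in the format Rasa Core expects:

>>> from rasa_core.agent import load_from_server
>>> from rasa_core.utils import EndpointConfig
>>> model_server = EndpointConfig("http://example.com/models/dialogue")
>>> agent = load_from_server(model_server=model_server)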

rasa_core.agent.start_model_pulling_in_worker(model_server: rasa_core.utils.EndpointConfig, wait_time_between_pulls: int, agent: rasa_core.agent.Agent) → None[source]