Advanced AI Workflows
ai_execute enables the orchestration of complex, multi-step workflows by dynamically interpreting natural language inputs and intelligently coordinating API calls. By leveraging multiple language models (LLMs), callback functions, and custom logic overrides, ai_execute offers extensive control over adaptive workflows in real time.
In this section, we'll explore how to:
- Chain APIs and incorporate AI-driven logic for sophisticated orchestrations.
- Fine-tune workflows with custom hooks, callbacks, and asynchronous execution.
Multi-API Orchestration with AI-Driven Logic
The core strength of ai_execute lies in its ability to process a single input and interpret the sequence of required actions across multiple APIs. This AI-driven approach allows for workflows that adapt to context, providing solutions that can scale and evolve without manual configuration.
Key Capabilities:
- Dynamic AI Orchestration: AI interprets user input to determine which APIs to call and in what sequence.
- Multi-LLM Compatibility: AI decisions are based on insights from various LLMs, allowing contextual adaptations and accurate processing.
- Multi-API Workflows: Coordinate actions across CRM, email, calendar, and more, all within a single workflow.
- Context-Aware Execution: Real-time responses and memory configurations allow for workflows that adapt based on historical data or current session information.
Example: Coordinated Multi-API Workflow with Contextual Decisions
Suppose you need the AI to decide whether to send an email, update a CRM, or schedule a meeting based on user instructions. Using ai_execute, a single command can trigger multiple actions while the AI interprets the most efficient order and logic for each task.
from fiscus import FiscusClient, FiscusLLMType

client = FiscusClient(api_key="YOUR_FISCUS_API_KEY")

# Define an adaptive query for a multi-API scenario
query = "Notify John about tomorrow's meeting, update his CRM record, and add the event to the calendar."

response = client.ai_execute(
    input=query,
    llm_type=FiscusLLMType.OPENAI,
    execution_mode='SEQUENTIAL'
)

if response.success:
    print(f"AI orchestrated tasks successfully: {response.result}")
else:
    print(f"Workflow error: {response.error_message}")
Workflow Breakdown:
In this scenario, ai_execute might:
- Use the Email API to notify John of the meeting.
- Update John’s contact record in CRM.
- Add the meeting to the Calendar API.
With ai_execute, these steps are coordinated without explicit, hardcoded API calls. The AI dynamically interprets the input, orchestrates the tasks, and manages the dependencies between them.
Example Workflow Structure:
- Input: "Notify John about tomorrow's meeting, update CRM, and add to the calendar."
- AI interprets:
- Step 1: Uses the Email API.
- Step 2: Updates the CRM.
- Step 3: Schedules the event on the Calendar API.
- Output: Consolidated results indicating each task's successful completion.
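As a rough illustration of that consolidated output, the per-task results returned by the example above might be inspected as follows. This is a minimal sketch that assumes response.result is a list of per-task entries with connector, operation, and status fields; the actual shape returned by the SDK may differ.

# Minimal sketch only: assumes response.result is a list of per-task
# dictionaries with 'connector', 'operation', and 'status' keys.
# The actual result structure returned by the Fiscus SDK may differ.
if response.success:
    for task in response.result:
        connector = task.get('connector')
        operation = task.get('operation')
        status = task.get('status')
        print(f"{connector}.{operation} -> {status}")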
Customizing Workflows with Hooks, Callbacks, and Execution Modes
For advanced control, ai_execute includes customizable pre- and post-execution hooks and specialized callbacks for handling errors, streaming, or other real-time conditions. Hooks allow you to modify parameters or inspect responses at various stages, while execution modes provide control over task sequences and conditional flows.
Core Benefits:
- Execution Modes: Choose SEQUENTIAL for ordered execution or decision_logic_override for branching and conditional workflows.
- Pre-Execution Hooks: Inspect and adjust parameters before API execution.
- Post-Execution Hooks: Process, log, or adjust responses after API completion.
- Advanced Callbacks: Error handling, real-time streaming, and decision callbacks provide robust control over workflow execution.
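The example below covers pre- and post-execution hooks. For the callback style, here is a purely hypothetical sketch: the on_error and on_stream parameter names are illustrative assumptions, not confirmed SDK signatures.

# Hypothetical sketch only: 'on_error' and 'on_stream' are assumed
# parameter names, not confirmed Fiscus SDK signatures.
def handle_error(error):
    # React to failures as they occur (alerting, retry bookkeeping, etc.)
    print(f"Workflow error encountered: {error}")

def handle_stream(chunk):
    # Process partial results as they arrive in real time
    print(f"Received partial result: {chunk}")

response = client.ai_execute(
    input="Draft a status update and email it to the team.",
    llm_type=FiscusLLMType.OPENAI,
    on_error=handle_error,    # assumed parameter name
    on_stream=handle_stream   # assumed parameter name
)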
Example: Workflow with Pre- and Post-Execution Hooks
Imagine a scenario where you need to log API call parameters before execution and analyze responses afterward. Registering hooks on the client provides exactly this control.
from fiscus import FiscusClient, FiscusLLMType

client = FiscusClient(api_key="YOUR_FISCUS_API_KEY")

# Pre-execution hook to log API call details
def pre_execution_hook(connector_name, operation_name, params):
    print(f"Preparing {operation_name} on {connector_name} with: {params}")
    return params

# Post-execution hook to log API response
def post_execution_hook(response):
    if response.success:
        print(f"Executed operation successfully. Result: {response.result}")
    else:
        print(f"Operation error: {response.error_message}")
    return response

# Set hooks in the client
client.set_pre_execution_hook(pre_execution_hook)
client.set_post_execution_hook(post_execution_hook)

query = "Send meeting details to John and log it in CRM."
response = client.ai_execute(input=query, llm_type=FiscusLLMType.ANTHROPIC)

if response.success:
    print("Workflow with custom hooks executed successfully.")
else:
    print(f"Execution error: {response.error_message}")
Example: Conditional Logic with decision_logic_override
The decision_logic_override parameter allows dynamic decision-making within workflows, where AI selects steps based on conditions or results from previous tasks.
def custom_decision_logic(input_text):
    # Define decisions based on input patterns or task outcomes
    return [
        {'connector': 'CalendarAPI', 'operation': 'create_event'},
        {'connector': 'CRMService', 'operation': 'update_contact'},
        {'connector': 'EmailService', 'operation': 'send_notification'}
    ]

response = client.ai_execute(
    input="Set a follow-up meeting and send email to the team.",
    llm_type=FiscusLLMType.GEMINI,
    decision_logic_override=custom_decision_logic
)

if response.success:
    print("Dynamic workflow completed:", response.result)
else:
    print("Error:", response.error_message)
Asynchronous Workflows with ai_execute_async
For non-blocking, real-time tasks, ai_execute_async handles workflows asynchronously, enabling concurrent task processing. This feature is useful for high-traffic or multi-threaded applications.
Example: Async Workflow with ai_execute_async
import asyncio

async def run_async_workflow():
    response = await client.ai_execute_async(
        input="Summarize and email this week’s completed tasks to the team.",
        llm_type=FiscusLLMType.ANTHROPIC
    )
    if response.success:
        print("Async workflow completed:", response.result)
    else:
        print("Async error:", response.error_message)

asyncio.run(run_async_workflow())
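Because ai_execute_async returns an awaitable, several independent workflows can also be dispatched concurrently with standard asyncio tooling. The sketch below assumes the two calls do not depend on each other's results:

import asyncio

async def run_concurrent_workflows():
    # Launch two independent workflows concurrently and wait for both.
    summary, reminder = await asyncio.gather(
        client.ai_execute_async(
            input="Summarize this week's completed tasks.",
            llm_type=FiscusLLMType.ANTHROPIC
        ),
        client.ai_execute_async(
            input="Schedule a reminder for Friday's review meeting.",
            llm_type=FiscusLLMType.ANTHROPIC
        )
    )
    for label, resp in (("Summary", summary), ("Reminder", reminder)):
        if resp.success:
            print(f"{label} workflow completed:", resp.result)
        else:
            print(f"{label} workflow error:", resp.error_message)

asyncio.run(run_concurrent_workflows())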
Summary of Advanced Workflow Capabilities
With ai_execute, you can:
- Combine Multiple APIs: Coordinate several APIs in a single flow, guided by AI-driven decisions.
- Customize with Hooks and Callbacks: Adapt workflows by modifying parameters, handling errors, and logging responses.
- Utilize Advanced Execution Modes: Choose SEQUENTIAL or conditional logic modes to suit your workflow needs.
- Implement Real-Time Async Processing: Enable asynchronous execution to handle high-traffic or time-sensitive workflows.
Using ai_execute, Fiscus SDK users can build adaptive, powerful integrations that respond dynamically to real-time needs, reducing manual management and enhancing the flexibility of API orchestration.