The CommunityRAPP system supports environment-aware storage that automatically adapts between Azure File Storage (for production) and local file system storage (for development).
This allows you to develop and test locally without Azure credentials, then deploy the same code to production unchanged.
The system automatically detects whether it is running in an Azure environment or a local development environment and selects the matching backend.
All components use `get_storage_manager()` from `utils/storage_factory.py`:

```python
from utils.storage_factory import get_storage_manager

# Automatically returns the appropriate storage manager
storage = get_storage_manager()
```
When using the local fallback, files are stored in:

```
.local_storage/
├── shared_memories/
│   └── memory.json
├── memory/
│   └── {user-guid}/
│       └── user_memory.json
├── agents/
│   └── custom_agent.py
├── demos/
│   └── demo_script.json
└── agent_config/
    └── {user-guid}/
        └── enabled_agents.json
```

This directory is created automatically and is excluded from git via `.gitignore`.
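If you want to pre-create this layout (for example, to seed test data before the function app first runs), a short `pathlib` script can build it. The directory names mirror the tree above; the GUID is just a placeholder:

```python
import json
from pathlib import Path

ROOT = Path(".local_storage")
USER_GUID = "00000000-0000-0000-0000-000000000000"  # placeholder GUID

# Mirror the directory tree used by the local fallback
for sub in [
    "shared_memories",
    f"memory/{USER_GUID}",
    "agents",
    "demos",
    f"agent_config/{USER_GUID}",
]:
    (ROOT / sub).mkdir(parents=True, exist_ok=True)

# Seed an empty shared memory file if none exists
shared = ROOT / "shared_memories" / "memory.json"
if not shared.exists():
    shared.write_text(json.dumps({"memories": []}, indent=4))
```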
### Local Development (Local Fallback)

**Setup:** No configuration needed; leave the Azure storage settings out of `local.settings.json`.

**Behavior:** Files are read and written under the `.local_storage/` directory.

**Getting Started:**
```bash
# 1. Install dependencies
pip install -r requirements.txt

# 2. Run the function app
./run.sh    # Mac/Linux
# or
.\run.ps1   # Windows

# 3. Files are automatically created in .local_storage/
```
### Azure Storage (Production-Like)

**Setup:** Add your Azure storage settings to `local.settings.json`.

**Behavior:** Files are read and written in Azure File Storage.

**Configuration:**
```json
{
  "Values": {
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=...",
    "AZURE_STORAGE_ACCOUNT_NAME": "your-storage-account",
    "AZURE_FILES_SHARE_NAME": "your-share-name"
  }
}
```
### Deployed Azure Environment

**Setup:** Automatic when the function app runs in Azure.

**Behavior:** Azure-hosted environments are detected via environment variables such as `WEBSITE_INSTANCE_ID`.

### Detection Logic

The system checks these environment variables (in order):

1. Azure environment markers such as `WEBSITE_INSTANCE_ID`. If any are present → Azure environment.
2. If none are present, it checks for Azure storage configuration:
   - `AZURE_STORAGE_ACCOUNT_NAME`
   - `AZURE_FILES_SHARE_NAME`
   - `AzureWebJobsStorage` (with `AccountKey`) OR token-based auth credentials

   If fully configured → uses Azure Storage. If not configured → uses the local fallback.
You can force local storage even when Azure credentials are present by setting:

```bash
export FORCE_LOCAL_STORAGE=true
```

Or in `local.settings.json`:

```json
{
  "Values": {
    "FORCE_LOCAL_STORAGE": "true"
  }
}
```
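The precedence described above can be sketched as a single function. This is an illustrative reimplementation, not the actual code in `utils/storage_factory.py`, and the exact set of Azure markers it checks beyond `WEBSITE_INSTANCE_ID` is an assumption:

```python
import os

def detect_storage_backend(env=None):
    """Return 'local' or 'azure' following the documented precedence."""
    if env is None:
        env = os.environ
    # 1. Explicit override always wins
    if env.get("FORCE_LOCAL_STORAGE", "").lower() == "true":
        return "local"
    # 2. Azure-hosted environment marker (WEBSITE_INSTANCE_ID)
    if env.get("WEBSITE_INSTANCE_ID"):
        return "azure"
    # 3. Fully configured Azure storage credentials
    has_account = bool(env.get("AZURE_STORAGE_ACCOUNT_NAME"))
    has_share = bool(env.get("AZURE_FILES_SHARE_NAME"))
    has_auth = "AccountKey" in env.get("AzureWebJobsStorage", "")
    if has_account and has_share and has_auth:
        return "azure"
    # 4. Otherwise fall back to local storage
    return "local"
```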
Both storage managers (`AzureFileStorageManager` and `LocalFileStorageManager`) implement the same interface:

```python
storage = get_storage_manager()

# Set memory context
storage.set_memory_context(user_guid)  # User-specific
storage.set_memory_context(None)       # Shared memory

# Read/Write JSON memory
data = storage.read_json()
storage.write_json({"key": "value"})

# Write file
storage.write_file('agents', 'my_agent.py', agent_code)

# Read file
content = storage.read_file('agents', 'my_agent.py')

# List files
files = storage.list_files('agents')
for file in files:
    print(file.name)

# Check existence
exists = storage.file_exists('agents', 'my_agent.py')

# Delete file
storage.delete_file('agents', 'my_agent.py')

# Get properties
props = storage.get_file_properties('agents', 'my_agent.py')
print(props['size'], props['last_modified'])

# Create directory
storage.ensure_directory_exists('custom_agents')
```
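For reference, a stripped-down local implementation of a few of these methods might look like the following. This is a sketch, not the actual `LocalFileStorageManager`:

```python
import json
from pathlib import Path

class MiniLocalStorage:
    """Illustrative subset of the storage interface, backed by a directory."""

    def __init__(self, root=".local_storage"):
        self.root = Path(root)
        self._memory_path = self.root / "shared_memories" / "memory.json"

    def set_memory_context(self, guid):
        # None -> shared memory; a GUID -> per-user memory
        if guid is None:
            self._memory_path = self.root / "shared_memories" / "memory.json"
        else:
            self._memory_path = self.root / "memory" / guid / "user_memory.json"

    def read_json(self):
        if not self._memory_path.exists():
            return {}
        return json.loads(self._memory_path.read_text())

    def write_json(self, data):
        self._memory_path.parent.mkdir(parents=True, exist_ok=True)
        self._memory_path.write_text(json.dumps(data, indent=4))

    def write_file(self, directory, name, content):
        path = self.root / directory / name
        path.parent.mkdir(parents=True, exist_ok=True)
        path.write_text(content)

    def read_file(self, directory, name):
        return (self.root / directory / name).read_text()

    def file_exists(self, directory, name):
        return (self.root / directory / name).exists()
```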
```bash
# Create agent file
cat > .local_storage/agents/test_agent.py << 'EOF'
from agents.basic_agent import BasicAgent

class TestAgent(BasicAgent):
    def __init__(self):
        self.name = 'Test'
        self.metadata = {
            "name": self.name,
            "description": "Test agent",
            "parameters": {"type": "object", "properties": {}}
        }
        super().__init__(self.name, self.metadata)

    def perform(self, **kwargs):
        return "Test successful!"
EOF

# Restart function app to load new agent
```
```python
import json

# Shared memory
shared = {
    "memories": [
        {
            "id": "mem_001",
            "type": "fact",
            "content": "The company was founded in 2020",
            "timestamp": "2025-01-15T10:00:00Z"
        }
    ]
}

with open('.local_storage/shared_memories/memory.json', 'w') as f:
    json.dump(shared, f, indent=4)
```
```bash
# Enable specific agents for a user GUID
mkdir -p .local_storage/agent_config/c0p110t0-aaaa-bbbb-cccc-123456789abc
cat > .local_storage/agent_config/c0p110t0-aaaa-bbbb-cccc-123456789abc/enabled_agents.json << 'EOF'
[
    "context_memory_agent.py",
    "manage_memory_agent.py",
    "test_agent.py"
]
EOF
```
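Code that honors this per-user configuration might filter the available agent files as follows. This is an illustrative sketch (the function names are ours, not the app's actual loading logic):

```python
import json
from pathlib import Path

def enabled_agents_for(user_guid, root=".local_storage"):
    """Return the agent filenames enabled for a user, or None if unrestricted."""
    config = Path(root) / "agent_config" / user_guid / "enabled_agents.json"
    if not config.exists():
        return None  # no per-user config: treat all agents as enabled
    return json.loads(config.read_text())

def load_agent_files(user_guid, root=".local_storage"):
    """List agent files in the agents directory, filtered by the user's config."""
    agents_dir = Path(root) / "agents"
    allowed = enabled_agents_for(user_guid, root)
    files = sorted(p.name for p in agents_dir.glob("*.py"))
    if allowed is None:
        return files
    return [f for f in files if f in allowed]
```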
**Cause:** The storage factory is not imported correctly.

**Solution:**

```python
# Wrong
from utils.azure_file_storage import AzureFileStorageManager
storage = AzureFileStorageManager()

# Correct
from utils.storage_factory import get_storage_manager
storage = get_storage_manager()
```
**Cause:** Directory permissions or path issues.

**Solution:**

```bash
# Check directory exists and is writable
ls -la .local_storage/
chmod -R u+w .local_storage/
```
**Cause:** Token-based auth issues when key-based auth is disabled.

**Solution:**

- For Local Development: verify that `.local_storage/` exists and has correct permissions.
- For Azure Deployment: verify that the identity used for token auth has access to the storage account.
### Local to Azure

When moving from local development to Azure:

```bash
# Backup local storage
tar -czf local-backup.tar.gz .local_storage/

# Deploy
./deploy.sh  # Deployment script
```

Then upload any locally created files through the storage manager:

```python
storage = get_storage_manager()

with open('my_agent.py', 'r') as f:
    storage.write_file('agents', 'my_agent.py', f.read())
```
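The one-file upload generalizes to a small loop. This sketch pushes every file in a local directory through whatever manager `get_storage_manager()` returns; the helper name is ours, not part of the codebase:

```python
from pathlib import Path

def upload_directory(storage, local_dir, remote_dir):
    """Copy every regular file in local_dir into remote_dir via the storage manager."""
    count = 0
    for path in sorted(Path(local_dir).iterdir()):
        if path.is_file():
            storage.write_file(remote_dir, path.name, path.read_text())
            count += 1
    return count
```

Usage: `upload_directory(storage, ".local_storage/agents", "agents")`.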
### Azure to Local
To test Azure data locally:
1. **Download from Azure:**

   ```python
   from utils.azure_file_storage import AzureFileStorageManager
   import os

   # Configure Azure credentials
   os.environ['AzureWebJobsStorage'] = 'your-connection-string'
   azure_storage = AzureFileStorageManager()

   # Download agents
   files = azure_storage.list_files('agents')
   for file in files:
       content = azure_storage.read_file('agents', file.name)
       with open(f'.local_storage/agents/{file.name}', 'w') as f:
           f.write(content)
   ```

2. **Run with local storage:**

   ```bash
   mv local.settings.json local.settings.json.backup
   ./run.sh
   ```
## Best Practices
### Development Workflow
1. **Start with local storage** - Fastest development cycle
2. **Test with local files** - Easy to inspect and modify
3. **Validate with Azure storage** - Before deploying to production
4. **Deploy to Azure** - Production environment
### Code Practices
```python
# Always use the factory
from utils.storage_factory import get_storage_manager

# Never hardcode storage types
# Bad:
# storage = AzureFileStorageManager()
# Good:
storage = get_storage_manager()
```

Handle storage errors gracefully:

```python
import logging

try:
    storage = get_storage_manager()
    data = storage.read_json()
except Exception as e:
    logging.error(f"Storage error: {e}")
    # Handle gracefully
```
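For transient failures (a network blip against Azure Files, for instance), a small retry helper can wrap any storage call. This is a generic pattern, not part of the storage managers themselves:

```python
import time
import logging

def with_retries(fn, attempts=3, delay=0.5):
    """Call fn(), retrying on exceptions with a fixed delay between attempts."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except Exception as e:
            logging.warning("Storage call failed (attempt %d/%d): %s",
                            attempt, attempts, e)
            if attempt == attempts:
                raise
            time.sleep(delay)

# Usage (assuming `storage` came from get_storage_manager()):
# data = with_retries(lambda: storage.read_json())
```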
The system logs which storage backend is in use:

```
INFO: Using local file storage for development
# or
INFO: Using Azure File Storage
```

Check the logs to verify the expected behavior.
You can create your own storage backend by implementing the same interface:

```python
class CustomStorageManager:
    def __init__(self):
        # Your initialization
        pass

    def set_memory_context(self, guid):
        # Implement
        pass

    def read_json(self):
        # Implement
        pass

    def write_json(self, data):
        # Implement
        pass

    # ... implement all other methods
```

Then wire it into `storage_factory.py`:

```python
# Use in storage_factory.py
def get_storage_manager():
    if custom_condition:
        return CustomStorageManager()
    # ... existing logic
```
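To keep custom backends honest, you could also formalize the interface as an abstract base class and have each manager subclass it. This is an optional design suggestion, not how the codebase is currently structured; the tiny in-memory backend below only exists to demonstrate the pattern:

```python
from abc import ABC, abstractmethod

class StorageManager(ABC):
    """Abstract contract shared by Azure, local, and custom backends."""

    @abstractmethod
    def set_memory_context(self, guid): ...

    @abstractmethod
    def read_json(self): ...

    @abstractmethod
    def write_json(self, data): ...

    @abstractmethod
    def write_file(self, directory, name, content): ...

    @abstractmethod
    def read_file(self, directory, name): ...

class InMemoryStorage(StorageManager):
    """Minimal concrete example: keeps everything in dictionaries."""

    def __init__(self):
        self._memory = {}
        self._files = {}
        self._context = None

    def set_memory_context(self, guid):
        self._context = guid

    def read_json(self):
        return self._memory.get(self._context, {})

    def write_json(self, data):
        self._memory[self._context] = data

    def write_file(self, directory, name, content):
        self._files[(directory, name)] = content

    def read_file(self, directory, name):
        return self._files[(directory, name)]
```

Subclassing the ABC means a forgotten method fails loudly at instantiation time instead of at the first missed call.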
The environment-aware storage system provides a single code path that works against Azure File Storage in production and the local file system in development. Start developing immediately without Azure credentials, then deploy to Azure when ready, all with the same codebase.