Commit a67917c3 authored by Hoanganhvu123's avatar Hoanganhvu123

feat(epic8+docx): complete Inbound Webhooks + CRM Inbox and full DOCX Document Management

EPIC 8 - INBOUND WEBHOOKS & CRM INBOX 
========================================
Backend:
- Add is_read field to memos table with auto-migration
- Inbound webhook endpoint: POST /inbound_webhooks/{workspace_id}
- Inbox endpoints: GET /users/{id}/inbox/unread_count, GET /users/{id}/inbox/memos
- Webhook payload formatting (title+body → markdown, arbitrary → bullet list)
- Optional secret verification via X-Webhook-Secret header
- Database indexes: idx_memos_is_read, idx_memos_workspace

Frontend:
- Inboxes.tsx: Two tabs (Notifications + Unread Memos)
- useInboxUnreadCount hook with 60s refetch
- useMarkMemoAsRead mutation
- Navigation badge: notifications + unread memos count
- i18n: inbox.* keys (en/vi)

DOCUMENT MANAGEMENT (DOCX/PDF) 🆕
==================================
Backend:
- New table: cuccu_documents with full schema
- DocumentService: upload, list, get, update, delete, process, import_to_memo
- AI enhancement via OpenAI (tag extraction + summarization)
- python-docx integration for text extraction
- Status workflow: PENDING → PROCESSING → COMPLETED/FAILED
- API routes: /documents/upload, /documents, /documents/{id}, /documents/{id}/process, /documents/{id}/import

Frontend:
- New page: DocumentsPage (route: /app/documents)
- Components: UploadArea, DocumentList, DocumentCard, PreviewPanel
- Hooks: useDocuments, useDocumentUpload, useDocumentProcess, useDocumentImport
- AI enhancement panel with toggles (extract tags, generate summary)
- Drag-and-drop file upload with validation
- Status badges, tag display, content preview
- Workspace isolation support
- Full i18n support (en/vi)

Files changed:
- Backend: 15+ files (new: api/documents/, common/sqlite_client.py updates, schemas, services, config)
- Frontend: 20+ files (new: components/Documents/, hooks/useDocuments.ts, docxService.ts, pages/DocumentsPage.tsx)
- Tests: test_inbox.py (5 tests passing)
- Plan docs: epic-8-inbound-webhooks-crm-inbox-done.md, docx-frontend-detailed-plan.md

🤖 Generated with Claude Code
parent f648ebbc
# CLAUDE.md
Behavioral guidelines to reduce common LLM coding mistakes. Merge with project-specific instructions as needed.
**Tradeoff:** These guidelines bias toward caution over speed. For trivial tasks, use judgment.
## 1. Think Before Coding
**Don't assume. Don't hide confusion. Surface tradeoffs.**
Before implementing:
- State your assumptions explicitly. If uncertain, ask.
- If multiple interpretations exist, present them - don't pick silently.
- If a simpler approach exists, say so. Push back when warranted.
- If something is unclear, stop. Name what's confusing. Ask.
## 2. Simplicity First
**Minimum code that solves the problem. Nothing speculative.**
- No features beyond what was asked.
- No abstractions for single-use code.
- No "flexibility" or "configurability" that wasn't requested.
- No error handling for impossible scenarios.
- If you write 200 lines and it could be 50, rewrite it.
Ask yourself: "Would a senior engineer say this is overcomplicated?" If yes, simplify.
## 3. Surgical Changes
**Touch only what you must. Clean up only your own mess.**
When editing existing code:
- Don't "improve" adjacent code, comments, or formatting.
- Don't refactor things that aren't broken.
- Match existing style, even if you'd do it differently.
- If you notice unrelated dead code, mention it - don't delete it.
When your changes create orphans:
- Remove imports/variables/functions that YOUR changes made unused.
- Don't remove pre-existing dead code unless asked.
The test: Every changed line should trace directly to the user's request.
## 4. Goal-Driven Execution
**Define success criteria. Loop until verified.**
Transform tasks into verifiable goals:
- "Add validation" → "Write tests for invalid inputs, then make them pass"
- "Fix the bug" → "Write a test that reproduces it, then make it pass"
- "Refactor X" → "Ensure tests pass before and after"
For multi-step tasks, state a brief plan:
```
1. [Step] → verify: [check]
2. [Step] → verify: [check]
3. [Step] → verify: [check]
```
Strong success criteria let you loop independently. Weak criteria ("make it work") require constant clarification.
---
**These guidelines are working if:** fewer unnecessary changes in diffs, fewer rewrites due to overcomplication, and clarifying questions come before implementation rather than after mistakes.
from datetime import datetime, timezone
from typing import Any
from fastapi import APIRouter, HTTPException, Request
from pydantic import BaseModel, EmailStr
from common.jwt_auth import (
create_access_token,
create_refresh_token,
get_password_hash,
verify_password,
decode_token,
)
from common.sqlite_client import sqlite_client as db
router = APIRouter(prefix="/api/v1/auth", tags=["auth"])
def utc_now() -> str:
return datetime.now(timezone.utc).isoformat()
class UserRegister(BaseModel):
username: str
email: EmailStr
password: str
class UserLogin(BaseModel):
username: str
password: str
class RefreshTokenRequest(BaseModel):
refresh_token: str
@router.post("/register")
async def register(user_data: UserRegister):
# Check if username already exists
existing = await db.fetch_one(
"SELECT id FROM cuccu_users WHERE username = ?", (user_data.username,)
)
if existing:
raise HTTPException(status_code=400, detail="Username already registered")
# Check if email already exists
existing_email = await db.fetch_one(
"SELECT id FROM cuccu_users WHERE email = ?", (user_data.email,)
)
if existing_email:
raise HTTPException(status_code=400, detail="Email already registered")
hashed_password = get_password_hash(user_data.password)
now = utc_now()
# Insert new user — sqlite_client.execute() auto-commits
cursor = await db.execute(
"INSERT INTO cuccu_users (username, email, password_hash, nickname, role, created_at) VALUES (?, ?, ?, ?, ?, ?)",
(user_data.username, user_data.email, hashed_password, user_data.username, "user", now),
)
user_id = str(cursor.lastrowid)
access_token = create_access_token(data={"sub": user_id})
refresh_token, expires_at = create_refresh_token(data={"sub": user_id})
await db.execute(
"INSERT INTO cuccu_refresh_tokens (user_id, token, expires_at, created_at) VALUES (?, ?, ?, ?)",
(user_id, refresh_token, expires_at.isoformat(), now),
)
return {
"access_token": access_token,
"refresh_token": refresh_token,
"token_type": "bearer",
"user": {
"id": user_id,
"username": user_data.username,
"email": user_data.email,
},
}
@router.post("/login")
async def login(user_data: UserLogin):
user = await db.fetch_one(
"SELECT * FROM cuccu_users WHERE username = ?", (user_data.username,)
)
if not user:
raise HTTPException(status_code=400, detail="Incorrect username or password")
if not verify_password(user_data.password, user["password_hash"]):
raise HTTPException(status_code=400, detail="Incorrect username or password")
user_id = str(user["id"])
access_token = create_access_token(data={"sub": user_id})
refresh_token, expires_at = create_refresh_token(data={"sub": user_id})
now = utc_now()
await db.execute(
"INSERT INTO cuccu_refresh_tokens (user_id, token, expires_at, created_at) VALUES (?, ?, ?, ?)",
(user_id, refresh_token, expires_at.isoformat(), now),
)
return {
"access_token": access_token,
"refresh_token": refresh_token,
"token_type": "bearer",
"user": {
"id": user_id,
"username": user["username"],
"email": user["email"],
},
}
@router.post("/refresh")
async def refresh(data: RefreshTokenRequest):
token = data.refresh_token
payload = decode_token(token)
if not payload or "sub" not in payload:
raise HTTPException(status_code=401, detail="Invalid refresh token")
user_id = payload["sub"]
stored = await db.fetch_one(
"SELECT id FROM cuccu_refresh_tokens WHERE token = ? AND user_id = ?",
(token, user_id),
)
if not stored:
raise HTTPException(status_code=401, detail="Refresh token revoked or invalid")
access_token = create_access_token(data={"sub": user_id})
return {"access_token": access_token, "token_type": "bearer"}
@router.get("/me")
async def get_me(request: Request):
if not request.state.is_authenticated:
raise HTTPException(status_code=401, detail="Not authenticated")
user_id = request.state.user_id
user = await db.fetch_one(
"SELECT id, username, email, role FROM cuccu_users WHERE id = ?", (user_id,)
)
if not user:
raise HTTPException(status_code=404, detail="User not found")
return {
"id": str(user["id"]),
"username": user["username"],
"email": user["email"],
"role": user["role"] or "user",
}
"""
Document management API routes for DOCX/PDF uploads, processing, and AI enhancement.
"""
import json
import logging
import os
from typing import List, Optional
from fastapi import APIRouter, Depends, File, Form, HTTPException, UploadFile, Query, Request
from common.memos_core.schemas import (
DocumentCreate,
DocumentResponse,
DocumentUpdate,
DocumentListResponse,
DocumentProcessRequest,
DocumentImportRequest,
DocumentUploadResponse,
DocumentDeleteResponse,
)
from common.memos_core.services import get_document_service
from common.encryption import mask_api_key
router = APIRouter(prefix="/documents", tags=["documents"])
logger = logging.getLogger(__name__)
@router.post("/upload", response_model=DocumentUploadResponse)
async def upload_document(
request: Request,
file: UploadFile = File(...),
workspace_id: str = Form(default="PERSONAL"),
title: Optional[str] = Form(default=None),
document_service=Depends(get_document_service),
):
"""
Upload a DOCX or PDF document.
- **file**: The document file to upload (DOCX, PDF supported)
- **workspace_id**: Workspace to associate with (PERSONAL, AI_SALES_CRM, etc.)
- **title**: Optional title override (defaults to original filename)
Returns the created document record with status PENDING. Processing happens asynchronously.
"""
# Validate file type
allowed_types = [
'application/vnd.openxmlformats-officedocument.wordprocessingml.document',
'application/pdf',
'text/plain'
]
if file.content_type not in allowed_types:
raise HTTPException(
status_code=400,
detail=f"Unsupported file type: {file.content_type}. Supported: DOCX, PDF, TXT"
)
# Read file content
try:
content = await file.read()
file_size = len(content)
except Exception as e:
logger.error(f"Failed to read uploaded file: {e}")
raise HTTPException(status_code=400, detail="Failed to read uploaded file")
# Verify user
user_id = getattr(request.state, "user_id", None) or "1"
# Ensure upload directory exists
upload_dir = document_service.upload_dir
os.makedirs(upload_dir, exist_ok=True)
# Create document record (file will be saved separately)
original_filename = file.filename or "untitled"
safe_filename = f"{generate_uid()}_{original_filename}"
file_path = os.path.join(upload_dir, safe_filename)
# Save file to disk
try:
with open(file_path, "wb") as f:
f.write(content)
except Exception as e:
logger.error(f"Failed to save uploaded file: {e}")
raise HTTPException(status_code=500, detail="Failed to save uploaded file")
# Create DB record
try:
doc_create = DocumentCreate(
filename=safe_filename,
original_filename=original_filename,
file_size=file_size,
mime_type=file.content_type,
workspace_id=workspace_id,
title=title,
)
doc_response = await document_service.create_document(doc_create, user_id)
return DocumentUploadResponse(
success=True,
document=doc_response,
message="Document uploaded successfully"
)
except Exception as e:
# Cleanup file on DB error
if os.path.exists(file_path):
os.remove(file_path)
logger.error(f"Failed to create document record: {e}")
raise HTTPException(status_code=500, detail=str(e))
@router.get("", response_model=DocumentListResponse)
async def list_documents(
request: Request,
workspace_id: Optional[str] = Query(default=None, description="Filter by workspace"),
status: Optional[str] = Query(default=None, description="Filter by status (PENDING, PROCESSING, COMPLETED, FAILED)"),
limit: int = Query(default=50, ge=1, le=100, description="Max number of documents to return"),
offset: int = Query(default=0, ge=0, description="Number of documents to skip"),
document_service=Depends(get_document_service),
):
"""
List documents for the authenticated user.
Supports filtering by workspace and status, with pagination.
"""
user_id = getattr(request.state, "user_id", None) or "1"
return await document_service.list_documents(
creator_id=user_id,
workspace_id=workspace_id,
status=status,
limit=limit,
offset=offset
)
@router.get("/{doc_id}", response_model=DocumentResponse)
async def get_document(
doc_id: int,
request: Request,
document_service=Depends(get_document_service),
):
"""
Get a specific document by ID.
"""
user_id = getattr(request.state, "user_id", None) or "1"
doc = await document_service.get_document(doc_id)
if not doc or doc.creator_id != user_id:
raise HTTPException(status_code=404, detail="Document not found")
return doc
@router.patch("/{doc_id}", response_model=DocumentResponse)
async def update_document(
doc_id: int,
request: Request,
payload: DocumentUpdate,
document_service=Depends(get_document_service),
):
"""
Update document metadata (title, tags, summary, workspace).
"""
user_id = getattr(request.state, "user_id", None) or "1"
doc = await document_service.update_document(doc_id, payload, user_id)
if not doc:
raise HTTPException(status_code=404, detail="Document not found or access denied")
return doc
@router.post("/{doc_id}/process", response_model=DocumentResponse)
async def process_document(
doc_id: int,
request: Request,
payload: Optional[DocumentProcessRequest] = None,
document_service=Depends(get_document_service),
):
"""
Manually trigger document processing and AI enhancement.
Extracts text from the document, generates tags and summary using AI (if enabled).
"""
user_id = getattr(request.state, "user_id", None) or "1"
doc = await document_service.get_document(doc_id)
if not doc or doc.creator_id != user_id:
raise HTTPException(status_code=404, detail="Document not found")
if doc.status == "PROCESSING":
raise HTTPException(status_code=400, detail="Document is already being processed")
# Re-process the document
try:
processed_doc = await document_service.process_document(doc_id, user_id)
return processed_doc
except Exception as e:
logger.error(f"Failed to process document {doc_id}: {e}")
raise HTTPException(status_code=500, detail=f"Processing failed: {str(e)}")
@router.post("/{doc_id}/import", response_model=DocumentResponse)
async def import_document_to_memo(
doc_id: int,
request: Request,
payload: Optional[DocumentImportRequest] = None,
document_service=Depends(get_document_service),
):
"""
Import document content as a new memo.
Creates a new memo with the document's extracted content, optionally with provided tags and visibility.
"""
user_id = getattr(request.state, "user_id", None) or "1"
if payload is None:
payload = DocumentImportRequest()
try:
memo = await document_service.import_to_memo(
doc_id=doc_id,
creator_id=user_id,
memo_title=payload.memo_title,
tags=payload.tags,
visibility=payload.visibility
)
# Return the original document with a note that import succeeded
doc = await document_service.get_document(doc_id)
return doc
except ValueError as e:
raise HTTPException(status_code=400, detail=str(e))
except Exception as e:
logger.error(f"Failed to import document {doc_id}: {e}")
raise HTTPException(status_code=500, detail=f"Import failed: {str(e)}")
@router.delete("/{doc_id}", response_model=DocumentDeleteResponse)
async def delete_document(
doc_id: int,
request: Request,
document_service=Depends(get_document_service),
):
"""
Delete a document and its associated file.
"""
user_id = getattr(request.state, "user_id", None) or "1"
deleted = await document_service.delete_document(doc_id, user_id)
if not deleted:
raise HTTPException(status_code=404, detail="Document not found or access denied")
return DocumentDeleteResponse(success=True, message="Document deleted successfully")
# Helper function (should be in sqlite_client but using here for convenience)
def generate_uid() -> str:
"""Generate a unique ID for documents."""
import secrets
return secrets.token_urlsafe(16)[:22]
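A minimal usage sketch for the document endpoints above, assuming a local server, a valid bearer token, and that this router is mounted without an extra prefix; the base URL, token placeholder, file name, and workspace are illustrative:
```
import httpx

BASE = "http://localhost:8000"                        # assumed dev server
HEADERS = {"Authorization": "Bearer <access-token>"}  # any valid JWT or PAT

with httpx.Client(base_url=BASE, headers=HEADERS) as client:
    # 1. Upload a DOCX file (multipart form, as expected by /documents/upload)
    with open("report.docx", "rb") as f:
        up = client.post(
            "/documents/upload",
            files={"file": ("report.docx", f,
                            "application/vnd.openxmlformats-officedocument.wordprocessingml.document")},
            data={"workspace_id": "PERSONAL", "title": "Q1 report"},
        )
    doc_id = up.json()["document"]["id"]

    # 2. Trigger text extraction and AI enhancement
    client.post(f"/documents/{doc_id}/process",
                json={"enhance_with_ai": True, "extract_tags": True, "generate_summary": True})

    # 3. Import the extracted content as a memo
    client.post(f"/documents/{doc_id}/import",
                json={"memo_title": "Q1 report", "tags": ["report"], "visibility": "PRIVATE"})
```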
"""
Inbound Webhook routes for CuCu Note.
Allows external systems to create memos via webhooks.
"""
import json
import logging
from typing import Optional
from fastapi import APIRouter, Body, HTTPException, Request, Response, Depends
from common.memos_core.schemas import MemoCreate
from common.memos_core.services import get_memo_service
from config import WEBHOOK_SECRET
logger = logging.getLogger(__name__)
router = APIRouter(prefix="/inbound_webhooks", tags=["webhooks"])
@router.post("/{workspace_id}")
async def receive_webhook(
workspace_id: str,
request: Request,
payload: dict = Body(...),
memo_service=Depends(get_memo_service),
):
"""
Receive an inbound webhook and create a memo.
Path params:
workspace_id: Target workspace (PERSONAL, AI_SALES_CRM, etc.)
Body:
Arbitrary JSON payload. Will be formatted into markdown content.
Returns:
The created memo.
"""
try:
# Verify webhook secret if configured
if WEBHOOK_SECRET:
secret = request.headers.get("X-Webhook-Secret")
if not secret or secret != WEBHOOK_SECRET:
raise HTTPException(status_code=403, detail="Invalid webhook secret")
# Format payload into markdown content
content = format_payload_to_markdown(payload)
# Extract tags from payload if available
tags = []
if "tags" in payload and isinstance(payload["tags"], list):
tags = payload["tags"]
elif "tag" in payload and isinstance(payload["tag"], str):
tags = [payload["tag"]]
# Determine visibility: default to PRIVATE, but allow override
visibility = payload.get("visibility", "PRIVATE")
if visibility not in ["PRIVATE", "PROTECTED", "PUBLIC"]:
visibility = "PRIVATE"
# Create memo with is_read=False (inbox/unread)
memo_create = MemoCreate(
content=content,
visibility=visibility,
tags=tags,
workspace_id=workspace_id,
is_read=False, # Inbound webhook memos start as unread
)
# Use memo_service with anonymous user (webhook system user)
memo = await memo_service.create_memo(
payload=memo_create,
user_id=None, # Will become "anonymous" or inherit from system
)
logger.info(
f"✅ Webhook received for workspace={workspace_id}, created memo id={memo.id}"
)
return {
"success": True,
"memo": memo,
"message": "Memo created from webhook",
}
except HTTPException:
raise
except Exception as e:
logger.error(f"❌ Webhook processing failed: {e}", exc_info=True)
raise HTTPException(status_code=500, detail=f"Webhook processing failed: {str(e)}")
def format_payload_to_markdown(payload: dict) -> str:
"""
Convert arbitrary JSON payload into a readable markdown format.
Rules:
- If payload has 'content' key (string), use it directly.
- If payload has 'title' and 'body', format as "# Title\n\nBody"
- Otherwise, pretty-print as markdown table or bullet list.
"""
lines = []
# Title priority: title > subject > heading > event_type > first key capitalized
title = None
for key in ["title", "subject", "heading", "event_type", "event"]:
if key in payload and payload[key]:
title = str(payload[key])
break
if title:
lines.append(f"# {title}")
lines.append("")
# Body/content
if "content" in payload and isinstance(payload["content"], str):
lines.append(str(payload["content"]))
elif "body" in payload and isinstance(payload["body"], str):
lines.append(str(payload["body"]))
else:
# Format remaining fields as bullet list
for key, value in payload.items():
if key in ["title", "subject", "heading", "content", "body", "tags", "tag", "visibility"]:
continue
if isinstance(value, (dict, list)):
lines.append(f"- **{key}**: `{json.dumps(value, ensure_ascii=False)}`")
else:
lines.append(f"- **{key}**: {value}")
return "\n".join(lines)
@@ -39,13 +39,31 @@ class ImageStorageService:
content_type = "image/jpeg"
import base64
import piexif
from PIL import Image
import io
file_content = base64.b64decode(base64_data)
if not filename:
ext = content_type.split("/")[-1]
if ext == "jpeg":
ext = "jpg"
filename = f"{uuid.uuid4()}.{ext}"
# Strip EXIF metadata to preserve privacy
if content_type in ["image/jpeg", "image/jpg", "image/tiff"]:
try:
img = Image.open(io.BytesIO(file_content))
if "exif" in img.info:
output_io = io.BytesIO()
# Use PIL to save without EXIF data
img.save(output_io, format=img.format, exif=b"")
file_content = output_io.getvalue()
logger.info(f"Stripped EXIF metadata from {filename}")
except Exception as e:
logger.warning(f"Could not strip EXIF data from {filename}: {e}")
# Supabase Storage REST API: /storage/v1/object/{bucket}/{path}
upload_url = f"{self.url}/storage/v1/object/{self.bucket_name}/{filename}"
...
import os
from datetime import datetime, timedelta, timezone
from passlib.context import CryptContext
import jwt
# Secret key should be loaded from config, but fallback here
SECRET_KEY = os.getenv("JWT_SECRET_KEY", "CuCu-Note-Secret-Key-123")
ALGORITHM = "HS256"
ACCESS_TOKEN_EXPIRE_MINUTES = 60 * 24 # 1 day for access token
REFRESH_TOKEN_EXPIRE_DAYS = 30 # 30 days for refresh token
pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")
def verify_password(plain_password: str, hashed_password: str) -> bool:
# bcrypt truncates at 72 bytes — must truncate consistently before verify
return pwd_context.verify(plain_password.encode("utf-8")[:72], hashed_password)
def get_password_hash(password: str) -> str:
# bcrypt has a 72-byte limit — truncate to avoid ValueError
return pwd_context.hash(password.encode("utf-8")[:72])
def create_access_token(data: dict, expires_delta: timedelta | None = None) -> str:
to_encode = data.copy()
if expires_delta:
expire = datetime.now(timezone.utc) + expires_delta
else:
expire = datetime.now(timezone.utc) + timedelta(minutes=ACCESS_TOKEN_EXPIRE_MINUTES)
to_encode.update({"exp": expire})
encoded_jwt = jwt.encode(to_encode, SECRET_KEY, algorithm=ALGORITHM)
return encoded_jwt
def create_refresh_token(data: dict) -> tuple[str, datetime]:
to_encode = data.copy()
expire = datetime.now(timezone.utc) + timedelta(days=REFRESH_TOKEN_EXPIRE_DAYS)
to_encode.update({"exp": expire})
encoded_jwt = jwt.encode(to_encode, SECRET_KEY, algorithm=ALGORITHM)
return encoded_jwt, expire
def decode_token(token: str) -> dict | None:
try:
payload = jwt.decode(token, SECRET_KEY, algorithms=[ALGORITHM])
return payload
except jwt.PyJWTError:
return None
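A quick round-trip sketch of how these helpers combine (the user id is arbitrary):
```
# Hash at registration, verify at login
hashed = get_password_hash("correct horse battery staple")
assert verify_password("correct horse battery staple", hashed)

# Issue and decode an access token for user "42"
token = create_access_token(data={"sub": "42"})
payload = decode_token(token)
assert payload is not None and payload["sub"] == "42"

# Refresh tokens also return their expiry so it can be persisted alongside the token
refresh_token, expires_at = create_refresh_token(data={"sub": "42"})
print(expires_at.isoformat())
```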
@@ -2,7 +2,7 @@
Thin wrapper so memos_core can share the common MongoDB client in `common.mongo_client`.
"""
-from common.mongo_client import (  # noqa: F401
+from common.mongodb import (
mongodb_client,
COLLECTION_MEMOS,
COLLECTION_ATTACHMENTS,
@@ -13,8 +13,9 @@ from common.mongo_client import (  # noqa: F401
COLLECTION_USER_SETTINGS,
generate_uid,
utc_now,
-serialize_doc,
+serialize_row,
-parse_object_id,
+parse_time,
format_time,
init_mongodb,
close_mongodb,
)
\ No newline at end of file
@@ -3,12 +3,28 @@
from __future__ import annotations
from datetime import datetime
from enum import Enum
from typing import Any, List, Optional
from pydantic import BaseModel, ConfigDict, EmailStr, Field
from pydantic.aliases import AliasChoices
class DeadlineStatus(str, Enum):
"""Trạng thái deadline của memo."""
PENDING = "PENDING" # Chưa làm, chưa quá hạn
DONE = "DONE" # Đã hoàn thành
OVERDUE = "OVERDUE" # Quá hạn mà chưa làm
class MemoRelationPriority(str, Enum):
"""Mức độ ưu tiên."""
LOW = "low"
MEDIUM = "medium"
HIGH = "high"
URGENT = "urgent"
class InstanceInfo(BaseModel):
mode: Optional[str] = None
version: Optional[str] = None
@@ -101,12 +117,19 @@ class MemoBase(BaseModel):
content: str
visibility: Optional[str] = "PRIVATE"
tags: List[str] = []
is_read: Optional[bool] = None
class MemoCreate(MemoBase):
create_time: Optional[datetime] = None
anonymous_id: Optional[str] = None
anonymous_name: Optional[str] = None
# --- Deadline fields ---
deadline: Optional[datetime] = None # Due date (UTC)
priority: Optional[str] = None # low | medium | high | urgent
reminder_at: Optional[datetime] = None # Reminder before the deadline
# --- Workspace isolation ---
workspace_id: Optional[str] = None # PERSONAL | AI_SALES_CRM
class MemoUpdate(BaseModel):
@@ -115,6 +138,14 @@ class MemoUpdate(BaseModel):
tags: Optional[List[str]] = None
pinned: Optional[bool] = None
row_status: Optional[str] = None
# --- Deadline fields ---
deadline: Optional[datetime] = None
priority: Optional[str] = None
reminder_at: Optional[datetime] = None
is_completed: Optional[bool] = None
completed_at: Optional[datetime] = None
# --- Inbox fields ---
is_read: Optional[bool] = None
class MemoVersion(BaseModel):
@@ -133,6 +164,14 @@ class MemoVersionResponse(BaseModel):
total: int = 0
class ReactionResponse(BaseModel):
name: str
creator: str
contentId: str
reactionType: str
model_config = ConfigDict(populate_by_name=True)
class MemoResponse(MemoBase):
id: str
uid: str = ""
@@ -147,10 +186,29 @@ class MemoResponse(MemoBase):
parent: Optional[str] = None
comment_count: int = 0
versions: List[MemoVersion] = [] # Version history
reactions: List[ReactionResponse] = [] # Reactions
# --- Deadline fields ---
deadline: Optional[str] = None # ISO string
deadline_status: Optional[str] = None # PENDING | DONE | OVERDUE
priority: Optional[str] = None # low | medium | high | urgent
reminder_at: Optional[str] = None
is_completed: bool = False
completed_at: Optional[str] = None
# --- Workspace isolation ---
workspace_id: Optional[str] = None # PERSONAL | AI_SALES_CRM
model_config = ConfigDict(populate_by_name=True)
class DeadlineListResponse(BaseModel):
"""Response for deadline listing API."""
overdue: List[MemoResponse] = [] # Overdue and not completed
today: List[MemoResponse] = [] # Due today
upcoming: List[MemoResponse] = [] # Coming up (next 7 days)
no_date: List[MemoResponse] = [] # Has a priority but no deadline
total: int = 0
class MemoEmbeddingCreate(BaseModel):
memoId: int
content: str
@@ -260,3 +318,135 @@ class IdentityProviderResponse(BaseModel):
type: str
# ====================== PERSONAL ACCESS TOKENS ======================
class PersonalAccessTokenCreate(BaseModel):
description: str
expires_at: Optional[datetime] = None
class PersonalAccessTokenResponse(BaseModel):
id: str
description: str
created_at: str
last_used_at: Optional[str] = None
expires_at: Optional[str] = None
is_active: bool = True
token: Optional[str] = None # Only included on creation
# ====================== WEBHOOKS ======================
class WebhookCreate(BaseModel):
name: str
url: str
events: List[str]
secret: Optional[str] = None
class WebhookUpdate(BaseModel):
name: Optional[str] = None
url: Optional[str] = None
events: Optional[List[str]] = None
secret: Optional[str] = None
is_active: Optional[bool] = None
class WebhookResponse(BaseModel):
id: str
user_id: str
name: str
url: str
events: List[str]
is_active: bool = True
created_at: str
updated_at: Optional[str] = None
# ====================== DOCUMENTS (DOCX/PDF) ======================
class DocumentBase(BaseModel):
"""Base schema for document."""
filename: str = Field(..., min_length=1, max_length=255)
workspace_id: str = Field(default="PERSONAL", pattern=r"^[A-Z0-9_]+$")
title: Optional[str] = Field(None, max_length=500)
class DocumentCreate(BaseModel):
"""Schema for creating a new document (upload)."""
filename: str = Field(..., min_length=1, max_length=255)
original_filename: str = Field(..., min_length=1, max_length=255)
file_size: int = Field(..., ge=1)
mime_type: str = Field(..., max_length=100)
workspace_id: str = Field(default="PERSONAL", pattern=r"^[A-Z0-9_]+$")
content: Optional[str] = Field(None, description="Extracted markdown content")
pages: Optional[int] = Field(None, ge=1)
class DocumentUpdate(BaseModel):
"""Schema for updating document metadata."""
title: Optional[str] = Field(None, max_length=500)
tags: Optional[List[str]] = Field(None, max_length=50)
summary: Optional[str] = Field(None, max_length=2000)
ai_enhanced: Optional[bool] = None
workspace_id: Optional[str] = Field(None, pattern=r"^[A-Z0-9_]+$")
class DocumentResponse(BaseModel):
"""Schema for document response."""
model_config = ConfigDict(from_attributes=True)
id: int
uid: str
creator_id: str
filename: str
original_filename: str
file_path: str
file_size: int
mime_type: str
workspace_id: str
status: str # PENDING | PROCESSING | COMPLETED | FAILED
pages: Optional[int] = None
title: Optional[str] = None
content: Optional[str] = None
summary: Optional[str] = None
tags: Optional[List[str]] = None
ai_enhanced: bool = False
error_message: Optional[str] = None
created_at: datetime
updated_at: datetime
processed_at: Optional[datetime] = None
class DocumentListResponse(BaseModel):
"""Schema for list of documents."""
documents: List[DocumentResponse] = []
total: int = 0
workspace_filter: Optional[str] = None
class DocumentProcessRequest(BaseModel):
"""Schema for requesting document processing/AI enhancement."""
enhance_with_ai: bool = Field(default=True, description="Enable AI tagging/summary")
generate_summary: bool = Field(default=True, description="Generate AI summary")
extract_tags: bool = Field(default=True, description="Extract tags from content")
class DocumentImportRequest(BaseModel):
"""Schema for importing document content into memos."""
memo_title: Optional[str] = Field(None, max_length=500, description="Title for imported memo")
tags: List[str] = Field(default_factory=list, max_length=50, description="Tags to add to memo")
visibility: str = Field(default="PRIVATE", pattern=r"^(PRIVATE|PROTECTED|PUBLIC)$")
create_as_memo: bool = Field(default=True, description="Create a memo from this document")
class DocumentUploadResponse(BaseModel):
"""Schema for document upload response."""
success: bool
document: DocumentResponse
message: str = "Document uploaded successfully"
class DocumentDeleteResponse(BaseModel):
"""Schema for document deletion response."""
success: bool
message: str = "Document deleted successfully"
This source diff could not be displayed because it is too large.
@@ -9,8 +9,9 @@ from typing import TYPE_CHECKING
from fastapi import Request
-from common.clerk_auth import verify_clerk_jwt
+from common.jwt_auth import decode_token
from config import DISABLE_AUTH
from common.memos_core.services import PersonalAccessTokenService
if TYPE_CHECKING:
from fastapi import FastAPI
@@ -95,7 +96,7 @@ class CuCuAuthMiddleware:
return
# =====================================================================
-# STEP 1: AUTHENTICATION (Clerk JWT)
+# STEP 1: AUTHENTICATION (Local JWT)
# =====================================================================
try:
auth_header = request.headers.get("Authorization")
@@ -108,30 +109,51 @@ class CuCuAuthMiddleware:
scope["state"]["is_authenticated"] = False
scope["state"]["device_id"] = device_id
else:
-# --- CASE 2: TOKEN PRESENT -> VERIFY CLERK JWT ---
+# --- CASE 2: TOKEN PRESENT -> VERIFY JWT or PAT ---
token = auth_header.replace("Bearer ", "")
jwt_success = False
try:
-payload = verify_clerk_jwt(token)
+payload = decode_token(token)
-user_id = payload.get("sub")
+if payload and "sub" in payload:
-if user_id:
+user_id = payload.get("sub")
scope["state"]["user"] = payload
scope["state"]["user_id"] = str(user_id)
scope["state"]["token"] = token
scope["state"]["is_authenticated"] = True
scope["state"]["device_id"] = device_id or str(user_id)
-logger.debug("✅ Clerk Auth Success: user_id=%s", user_id)
+logger.debug("✅ Local JWT Auth Success: user_id=%s", user_id)
jwt_success = True
else:
-logger.warning("⚠️ Clerk token missing sub -> Guest Mode")
+logger.warning("⚠️ JWT token missing sub, trying PAT")
except Exception as e:
logger.warning("⚠️ JWT decode error: %s, trying PAT", e)
# If JWT failed, try Personal Access Token
if not jwt_success:
try:
pat_service = PersonalAccessTokenService()
pat_verify = await pat_service.verify_token(token)
if pat_verify:
user_id = pat_verify["user_id"]
scope["state"]["user"] = {"sub": user_id}
scope["state"]["user_id"] = str(user_id)
scope["state"]["token"] = token
scope["state"]["is_authenticated"] = True
scope["state"]["device_id"] = device_id or str(user_id)
logger.debug("✅ PAT Auth Success: user_id=%s", user_id)
else:
# PAT verification failed -> Guest
logger.warning("⚠️ PAT verification failed -> Guest Mode")
scope["state"]["user"] = None
scope["state"]["user_id"] = None
scope["state"]["is_authenticated"] = False
scope["state"]["device_id"] = device_id
except Exception as pat_e:
logger.error("❌ PAT Auth Error: %s -> Guest Mode", pat_e)
scope["state"]["user"] = None scope["state"]["user"] = None
scope["state"]["user_id"] = None scope["state"]["user_id"] = None
scope["state"]["is_authenticated"] = False scope["state"]["is_authenticated"] = False
scope["state"]["device_id"] = device_id scope["state"]["device_id"] = device_id
except Exception as e:
logger.error("❌ Clerk Auth Error: %s -> Guest Mode", e)
scope["state"]["user"] = None
scope["state"]["user_id"] = None
scope["state"]["is_authenticated"] = False
scope["state"]["device_id"] = device_id
except Exception as e:
logger.error(f"❌ Middleware Auth Error: {e}")
...
"""
MongoDB-like wrapper interface backed by SQLite.
Provides attributes and methods similar to the MongoDB client so that services.py
does not need major changes to its query logic (the .find_one()-style calls keep the same names).
"""
from common.sqlite_client import (
sqlite_client as _sqlite_client,
generate_uid,
utc_now,
serialize_row,
parse_time,
format_time,
init_sqlite as init_mongodb,
close_sqlite as close_mongodb,
)
# Re-export for compatibility
mongodb_client = _sqlite_client
# Collection constants (re-export)
from common.sqlite_client import (
TABLE_USERS as COLLECTION_USERS,
TABLE_MEMOS as COLLECTION_MEMOS,
TABLE_ATTACHMENTS as COLLECTION_ATTACHMENTS,
TABLE_MEMO_RELATIONS as COLLECTION_MEMO_RELATIONS,
TABLE_REACTIONS as COLLECTION_REACTIONS,
TABLE_MEMO_EMBEDDINGS as COLLECTION_MEMO_EMBEDDINGS,
TABLE_INBOX as COLLECTION_INBOX,
TABLE_USER_SETTINGS as COLLECTION_USER_SETTINGS,
TABLE_SHORTCUTS as COLLECTION_SHORTCUTS,
TABLE_TEAMS as COLLECTION_TEAMS,
TABLE_TEAM_MEMBERS as COLLECTION_TEAM_MEMBERS,
TABLE_TEAM_MEMOS as COLLECTION_TEAM_MEMOS,
TABLE_TEAM_COMMENTS as COLLECTION_TEAM_COMMENTS,
TABLE_TEAM_REACTIONS as COLLECTION_TEAM_REACTIONS,
TABLE_USER_PROFILES as COLLECTION_USER_PROFILES,
TABLE_REFRESH_TOKENS as COLLECTION_REFRESH_TOKENS,
TABLE_PERSONAL_ACCESS_TOKENS as COLLECTION_PERSONAL_ACCESS_TOKENS,
TABLE_WEBHOOKS as COLLECTION_WEBHOOKS,
)
__all__ = [
"mongodb_client",
"generate_uid",
"utc_now",
"serialize_row",
"parse_time",
"format_time",
"init_mongodb",
"close_mongodb",
"COLLECTION_MEMOS",
"COLLECTION_ATTACHMENTS",
"COLLECTION_MEMO_RELATIONS",
"COLLECTION_REACTIONS",
"COLLECTION_MEMO_EMBEDDINGS",
"COLLECTION_INBOX",
"COLLECTION_USER_SETTINGS",
"COLLECTION_SHORTCUTS",
"COLLECTION_TEAMS",
"COLLECTION_TEAM_MEMBERS",
"COLLECTION_TEAM_MEMOS",
"COLLECTION_TEAM_COMMENTS",
"COLLECTION_TEAM_REACTIONS",
"COLLECTION_USER_PROFILES",
"COLLECTION_USERS",
"COLLECTION_REFRESH_TOKENS",
"COLLECTION_PERSONAL_ACCESS_TOKENS",
"COLLECTION_WEBHOOKS",
]
@@ -69,6 +69,7 @@ __all__ = [
"CONV_SUPABASE_KEY",
"CONV_SUPABASE_URL",
"DEFAULT_MODEL",
"DOCUMENT_UPLOAD_DIR", # Add this
"FIRECRAWL_API_KEY",
"GOOGLE_API_KEY",
"GROQ_API_KEY",
@@ -117,6 +118,7 @@ __all__ = [
"ENCRYPTION_PASSWORD",
"MEILI_URL",
"MEILI_MASTER_KEY",
"WEBHOOK_SECRET",
]
# ====================== SUPABASE CONFIGURATION ======================
@@ -225,6 +227,12 @@ MEMO_DB_PATH: str = os.getenv(
os.path.join(os.path.dirname(__file__), "db", "memos.db"),
)
# Document upload directory
DOCUMENT_UPLOAD_DIR: str = os.getenv(
"DOCUMENT_UPLOAD_DIR",
os.path.join(os.path.dirname(__file__), "uploads", "documents"),
)
# ====================== CORS CONFIGURATION ======================
# CORS origins - comma-separated list, or "*" for all origins
# Default: "*" for development, should be restricted in production
@@ -244,3 +252,6 @@ ENCRYPTION_PASSWORD: str | None = os.getenv("ENCRYPTION_PASSWORD") # Fallback f
# ====================== MEILISEARCH CONFIGURATION ======================
MEILI_URL: str | None = os.getenv("MEILI_URL")
MEILI_MASTER_KEY: str | None = os.getenv("MEILI_MASTER_KEY")
# ====================== WEBHOOK CONFIGURATION ======================
WEBHOOK_SECRET: str | None = os.getenv("WEBHOOK_SECRET") # Secret for inbound webhook auth
#!/usr/bin/env python
import os, asyncio, sys
os.environ['MEMO_DB_PATH'] = ':memory:'
from fastapi.testclient import TestClient
from server import app
# Initialize DB
from common.sqlite_client import init_sqlite
asyncio.run(init_sqlite())
client = TestClient(app, raise_server_exceptions=False)
# Generate JWT token directly
from common.jwt_auth import create_access_token
token = create_access_token(data={"sub": "test_user"})
print(f'Generated token: {token}')
headers = {'Authorization': f'Bearer {token}'}
# Create memo without workspace_id
create1 = client.post('/api/v1/memos', json={'content':'test','visibility':'PRIVATE'}, headers=headers)
print(f'Create (no workspace): {create1.status_code}')
if create1.status_code != 200:
print(f'Create error: {create1.text}')
else:
print(f'Created memo: {create1.json()}')
# Create memo with workspace_id
create2 = client.post('/api/v1/memos', json={'content':'test2','visibility':'PRIVATE','workspace_id':'PERSONAL'}, headers=headers)
print(f'Create (with workspace PERSONAL): {create2.status_code}')
if create2.status_code != 200:
print(f'Create error: {create2.text}')
else:
print(f'Created memo: {create2.json()}')
# List memos with workspace filter
list_resp = client.get('/api/v1/memos?workspace_id=PERSONAL', headers=headers)
print(f'List PERSONAL: {list_resp.status_code}')
if list_resp.status_code != 200:
print(f'List error: {list_resp.text}')
else:
print(f'List count: {len(list_resp.json())}')
for m in list_resp.json():
print(f" - {m['id']}: workspace={m.get('workspace_id')}")
import sqlite3
con = sqlite3.connect("db/memos.db")
cur = con.execute("SELECT name FROM sqlite_master WHERE type='table'")
tables = [r[0] for r in cur.fetchall()]
print("Tables:", tables)
# Check if cuccu_refresh_tokens exists
if "cuccu_refresh_tokens" not in tables:
print("MISSING: cuccu_refresh_tokens - creating...")
con.execute("""
CREATE TABLE IF NOT EXISTS cuccu_refresh_tokens (
id INTEGER PRIMARY KEY AUTOINCREMENT,
user_id TEXT NOT NULL,
token TEXT NOT NULL,
expires_at TEXT,
created_at TEXT
)
""")
con.commit()
print("Created cuccu_refresh_tokens OK")
else:
print("OK: cuccu_refresh_tokens exists")
con.close()
# Core FastAPI
fastapi
uvicorn==0.38.0
-uvloop>=0.21.0
+# uvloop>=0.21.0 # Linux/Mac only - skip on Windows
starlette==0.50.0
pydantic==2.12.5
pydantic_core==2.41.5
-# Database - MongoDB
+# Database - SQLite (replaces MongoDB)
-motor==3.7.0
+aiosqlite
# Database - PostgreSQL (for LangGraph checkpoints)
psycopg==3.3.2
@@ -24,7 +24,8 @@ meilisearch>=0.31.0
PyJWT==2.10.1
cryptography==46.0.3
email-validator==2.2.0
passlib[bcrypt]==1.7.4
bcrypt==3.2.2 # pinned: passlib 1.7.4 is NOT compatible with bcrypt 4.x+
# Async
aiofiles==25.1.0
anyio==4.12.0
@@ -72,7 +73,8 @@ tenacity==9.1.2
backoff==2.2.1
regex==2025.11.3
Unidecode==1.4.0
-# pillow==12.0.0 # Removed: not directly imported
+pillow==12.0.0 # Image processing
piexif==1.1.3 # EXIF stripping
# WebSocket
websockets==15.0.1
@@ -107,3 +109,8 @@ pytest==9.0.2
# Production server
# gunicorn==23.0.0 # Removed: using uvicorn instead
aiosqlite
# Auth
# passlib[bcrypt]==1.7.4 # defined above with bcrypt pin
# Image Privacy
piexif==1.1.3
@@ -11,8 +11,11 @@ from fastapi.responses import RedirectResponse, ORJSONResponse
from api.chatbot import router as chatbot_router
from api.memos import router as memos_router
from api.auth_routes import router as auth_router
from api.test_chat_route import router as test_router
from api.team_routes import router as team_router
from api.inbound_webhook_routes import router as webhook_router
from api.documents import documents_routes # DOCX/PDF management
from common.cache import redis_cache
from common.langfuse_client import get_langfuse_client
from common.meili_service import meili_service
@@ -48,10 +51,10 @@ async def lifespan(app: FastAPI):
else:
logger.info("⚠️ Redis cache unavailable - continuing without cache")
-# MongoDB initialization (required)
+# SQLite initialization (required)
-from common.mongo_client import init_mongodb
+from common.sqlite_client import init_sqlite
-await init_mongodb()
+await init_sqlite()
-logger.info("✅ MongoDB connection initialized")
+logger.info("✅ SQLite connection initialized")
# Meilisearch initialization (optional)
meili_ok = await meili_service.initialize()
@@ -80,11 +83,11 @@ async def lifespan(app: FastAPI):
logger.debug(f"Error closing Redis: {e}")
try:
-from common.mongo_client import close_mongodb
+from common.sqlite_client import close_sqlite
-await close_mongodb()
+await close_sqlite()
-logger.info("MongoDB connection closed")
+logger.info("SQLite connection closed")
except Exception as e:
-logger.debug(f"Error closing MongoDB: {e}")
+logger.debug(f"Error closing SQLite: {e}")
app = FastAPI(
@@ -108,9 +111,12 @@ middleware_manager.setup(
)
app.include_router(test_router) # No-auth test endpoints
app.include_router(auth_router)
app.include_router(chatbot_router)
app.include_router(memos_router)
app.include_router(team_router)
app.include_router(webhook_router) # Inbound webhooks
app.include_router(documents_routes) # DOCX/PDF document management
# ==========================================
# Mount static HTML
...
"""Quick test to verify SQLite setup works."""
import sys
sys.path.insert(0, '.')
import asyncio
from common.sqlite_client import init_sqlite, sqlite_client
async def test():
print("Connecting to SQLite...")
await init_sqlite()
print("Database initialized")
# Test insert
await sqlite_client.execute(
"INSERT INTO cuccu_users (email, nickname, username) VALUES (?, ?, ?)",
("test@example.com", "Test User", "testuser")
)
print("Inserted test user")
# Test query
row = await sqlite_client.fetch_one(
"SELECT * FROM cuccu_users WHERE email = ?",
("test@example.com",)
)
if row:
print(f"Found user: id={row['id']}, email={row['email']}")
else:
print("User not found")
# Test memos table
await sqlite_client.execute(
"INSERT INTO cuccu_memos (uid, creator_id, content, visibility) VALUES (?, ?, ?, ?)",
("test123", "1", "Test memo content", "PUBLIC")
)
print("Inserted test memo")
rows = await sqlite_client.fetch_all("SELECT * FROM cuccu_memos", ())
print(f"Total memos: {len(rows)}")
print("\nAll tests passed!")
if __name__ == "__main__":
asyncio.run(test())
""" """
Pytest configuration and fixtures for backend tests. Pytest configuration and fixtures for backend tests.
""" """
import os
import pytest import pytest
from fastapi.testclient import TestClient from fastapi.testclient import TestClient
from server import app from server import app
# Override MEMO_DB_PATH for tests before importing sqlite_client
os.environ.setdefault("MEMO_DB_PATH", ":memory:")
@pytest.fixture @pytest.fixture
def client(): def client():
"""FastAPI test client.""" """FastAPI test client."""
return TestClient(app) with TestClient(app, raise_server_exceptions=True) as client:
yield client
@pytest.fixture @pytest.fixture
...@@ -41,30 +47,12 @@ def test_memo_data(): ...@@ -41,30 +47,12 @@ def test_memo_data():
@pytest.fixture @pytest.fixture
async def auth_token(client, test_user_data): def auth_token():
"""Get authentication token for test user.""" """Generate a valid JWT token for testing."""
# Sign up test user from common.jwt_auth import create_access_token
signup_response = client.post( # Create token with user ID "1" to match test URLs
"/api/v1/auth/signup", token = create_access_token(data={"sub": "1"})
json=test_user_data return token
)
if signup_response.status_code == 200:
return signup_response.json().get("token")
# If user exists, sign in
signin_response = client.post(
"/api/v1/auth/signin",
json={
"email": test_user_data["email"],
"password": test_user_data["password"],
}
)
if signin_response.status_code == 200:
return signin_response.json().get("token")
return None
@pytest.fixture
@@ -80,6 +68,6 @@ async def cleanup_test_data():
"""Cleanup test data after tests."""
yield
# Clean up test memos, comments, etc.
# This will be implemented based on test data patterns
pass
"""Test inbox unread memo functionality."""
import pytest
from fastapi.testclient import TestClient
from server import app
@pytest.fixture
def client():
with TestClient(app) as c:
yield c
def test_inbox_unread_count(client, authenticated_headers):
"""Test that /users/{id}/inbox/unread_count returns correct count."""
# Create some memos with is_read=False
client.post("/api/v1/memos", json={
"content": "Unread memo 1",
"visibility": "PRIVATE",
"workspace_id": "PERSONAL",
"is_read": False
}, headers=authenticated_headers)
client.post("/api/v1/memos", json={
"content": "Unread memo 2",
"visibility": "PRIVATE",
"workspace_id": "PERSONAL",
"is_read": False
}, headers=authenticated_headers)
# Create a read memo
client.post("/api/v1/memos", json={
"content": "Read memo",
"visibility": "PRIVATE",
"workspace_id": "PERSONAL",
"is_read": True
}, headers=authenticated_headers)
# Get unread count
response = client.get("/api/v1/users/1/inbox/unread_count", headers=authenticated_headers)
assert response.status_code == 200
data = response.json()
assert "unread_count" in data
assert data["unread_count"] >= 2 # At least 2 unread
def test_inbox_list_unread_memos(client, authenticated_headers):
"""Test that /users/{id}/inbox/memos returns only unread memos."""
# Clear existing and create test data
response = client.get("/api/v1/users/1/inbox/memos", headers=authenticated_headers)
assert response.status_code == 200
memos = response.json()
# All returned memos should have is_read=False
for memo in memos:
assert memo.get("is_read") is False, f"Memo {memo['id']} should be unread"
def test_mark_memo_as_read(client, authenticated_headers):
"""Test marking a memo as read via PATCH /memos/{id}."""
# Create an unread memo
create_response = client.post("/api/v1/memos", json={
"content": "Memo to mark as read",
"visibility": "PRIVATE",
"workspace_id": "PERSONAL",
"is_read": False
}, headers=authenticated_headers)
memo_id = create_response.json()["id"]
# Verify it's unread
unread_response = client.get("/api/v1/users/1/inbox/unread_count", headers=authenticated_headers)
initial_count = unread_response.json()["unread_count"]
# Mark as read
patch_response = client.patch(f"/api/v1/memos/{memo_id}", json={
"is_read": True
}, headers=authenticated_headers)
assert patch_response.status_code == 200
updated_memo = patch_response.json()
assert updated_memo.get("is_read") is True
# Verify count decreased
after_response = client.get("/api/v1/users/1/inbox/unread_count", headers=authenticated_headers)
after_count = after_response.json()["unread_count"]
assert after_count == initial_count - 1
def test_webhook_creates_unread_memo(client):
"""Test that inbound webhook creates memo with is_read=False."""
workspace_id = "AI_SALES_CRM"
webhook_payload = {
"title": "Test Webhook",
"body": "This is a test webhook payload",
"tags": ["webhook", "test"]
}
# Note: webhook endpoint does NOT require auth and path is /inbound_webhooks (not /api/v1/inbound_webhooks)
response = client.post(
f"/inbound_webhooks/{workspace_id}",
json=webhook_payload
)
assert response.status_code == 200, f"Failed: {response.text}"
data = response.json()
memo = data["memo"]
# Verify memo was created in correct workspace with is_read=False
assert memo["workspace_id"] == workspace_id
assert memo.get("is_read") is False
def test_webhook_formats_payload_to_markdown(client):
"""Test that webhook properly formats various payloads to markdown."""
# Test with title + body
response1 = client.post("/inbound_webhooks/PERSONAL", json={
"title": "My Title",
"body": "My body content"
})
response_data1 = response1.json()
# Webhook returns {"success": True, "memo": {...}, "message": "..."}
if "memo" in response_data1:
memo1 = response_data1["memo"]
else:
print(f"Unexpected response: {response_data1}")
raise AssertionError(f"Response missing 'memo' key: {response_data1}")
assert "# My Title" in memo1["content"]
assert "My body content" in memo1["content"]
# Test with only arbitrary fields
response2 = client.post("/inbound_webhooks/PERSONAL", json={
"event_type": "user.signup",
"user_id": 123,
"timestamp": "2024-01-01"
})
response_data2 = response2.json()
if "memo" in response_data2:
memo2 = response_data2["memo"]
else:
print(f"Unexpected response: {response_data2}")
raise AssertionError(f"Response missing 'memo' key: {response_data2}")
assert "**event_type**: user.signup" in memo2["content"]
assert "**user_id**: 123" in memo2["content"]
@@ -227,3 +227,169 @@ class TestDeleteMemo:
status.HTTP_200_OK # If soft delete
]
class TestWorkspaceIsolation:
"""Test workspace isolation feature (PERSONAL vs AI_SALES_CRM)."""
def test_workspace_default_personal(self, client, authenticated_headers, test_memo_data):
"""Test that creating a memo without workspace_id defaults to PERSONAL."""
# Create memo without specifying workspace_id
response = client.post(
"/api/v1/memos",
json=test_memo_data,
headers=authenticated_headers
)
assert response.status_code in [status.HTTP_200_OK, status.HTTP_201_CREATED]
memo = response.json()
# Check workspace_id defaults to PERSONAL
assert memo.get("workspace_id") == "PERSONAL"
def test_workspace_isolation(self, client, authenticated_headers):
"""Test that memos are properly isolated by workspace_id."""
# Create memo in PERSONAL workspace
personal_memo = client.post(
"/api/v1/memos",
json={
"content": "Personal memo",
"visibility": "PRIVATE",
"workspace_id": "PERSONAL"
},
headers=authenticated_headers
)
assert personal_memo.status_code in [200, 201]
personal_memo_id = personal_memo.json()["id"]
# Create memo in AI_SALES_CRM workspace
ai_memo = client.post(
"/api/v1/memos",
json={
"content": "AI Sales CRM memo",
"visibility": "PRIVATE",
"workspace_id": "AI_SALES_CRM"
},
headers=authenticated_headers
)
assert ai_memo.status_code in [200, 201]
ai_memo_id = ai_memo.json()["id"]
# Fetch PERSONAL workspace memos
personal_response = client.get(
"/api/v1/memos?workspace_id=PERSONAL",
headers=authenticated_headers
)
assert personal_response.status_code == status.HTTP_200_OK
personal_memos = personal_response.json()
# Should contain personal memo
assert len(personal_memos) >= 1
personal_ids = [m["id"] for m in personal_memos]
assert personal_memo_id in personal_ids
# Should NOT contain AI memo
assert ai_memo_id not in personal_ids
# Fetch AI_SALES_CRM workspace memos
ai_response = client.get(
"/api/v1/memos?workspace_id=AI_SALES_CRM",
headers=authenticated_headers
)
assert ai_response.status_code == status.HTTP_200_OK
ai_memos = ai_response.json()
# Should contain AI memo
assert len(ai_memos) >= 1
ai_ids = [m["id"] for m in ai_memos]
assert ai_memo_id in ai_ids
# Should NOT contain personal memo
assert personal_memo_id not in ai_ids
# Verify workspace_id in responses
for memo in personal_memos:
if memo["id"] == personal_memo_id:
assert memo.get("workspace_id") == "PERSONAL"
for memo in ai_memos:
if memo["id"] == ai_memo_id:
assert memo.get("workspace_id") == "AI_SALES_CRM"
def test_workspace_comment_inheritance(self, client, authenticated_headers):
"""Test that comments inherit workspace_id from parent memo."""
# Create parent memo in AI_SALES_CRM workspace
parent_response = client.post(
"/api/v1/memos",
json={
"content": "Parent memo in AI workspace",
"visibility": "PRIVATE",
"workspace_id": "AI_SALES_CRM"
},
headers=authenticated_headers
)
assert parent_response.status_code in [200, 201]
parent_memo = parent_response.json()
parent_id = parent_memo["id"]
parent_workspace = parent_memo.get("workspace_id")
assert parent_workspace == "AI_SALES_CRM"
# Create comment on parent memo (no workspace_id in payload)
comment_response = client.post(
f"/api/v1/memos/{parent_id}/comments",
json={
"content": "Comment on AI memo",
"visibility": "PRIVATE"
},
headers=authenticated_headers
)
assert comment_response.status_code in [200, 201]
comment = comment_response.json()
comment_id = comment["id"]
# Comment should inherit AI_SALES_CRM workspace
assert comment.get("workspace_id") == "AI_SALES_CRM"
# Verify by fetching comment directly
fetched_comment = client.get(
f"/api/v1/memos/{comment_id}",
headers=authenticated_headers
)
assert fetched_comment.status_code == 200
fetched_data = fetched_comment.json()
assert fetched_data.get("workspace_id") == "AI_SALES_CRM"
def test_workspace_isolation_get_memo(self, client, authenticated_headers):
"""Test that get_memo respects workspace_id parameter."""
# Create two memos in different workspaces
personal = client.post(
"/api/v1/memos",
json={"content": "Personal", "visibility": "PRIVATE", "workspace_id": "PERSONAL"},
headers=authenticated_headers
)
ai = client.post(
"/api/v1/memos",
json={"content": "AI", "visibility": "PRIVATE", "workspace_id": "AI_SALES_CRM"},
headers=authenticated_headers
)
personal_id = personal.json()["id"]
ai_id = ai.json()["id"]
# Get personal memo without workspace filter (allowed if owner)
personal_get = client.get(f"/api/v1/memos/{personal_id}", headers=authenticated_headers)
assert personal_get.status_code == status.HTTP_200_OK
# Get AI memo without workspace filter (allowed if owner)
ai_get = client.get(f"/api/v1/memos/{ai_id}", headers=authenticated_headers)
assert ai_get.status_code == status.HTTP_200_OK
# Try to get AI memo with workspace_id=PERSONAL should be denied
ai_as_personal = client.get(
f"/api/v1/memos/{ai_id}?workspace_id=PERSONAL",
headers=authenticated_headers
)
assert ai_as_personal.status_code in [status.HTTP_404_NOT_FOUND, status.HTTP_403_FORBIDDEN]
# Try to get personal memo with workspace_id=AI_SALES_CRM should be denied
personal_as_ai = client.get(
f"/api/v1/memos/{personal_id}?workspace_id=AI_SALES_CRM",
headers=authenticated_headers
)
assert personal_as_ai.status_code in [status.HTTP_404_NOT_FOUND, status.HTTP_403_FORBIDDEN]
memos @ 583c3d24
Subproject commit 583c3d24f4d785faa5e034c1b88ed90eda119baa