feat: add backend api specification document

This commit introduces a new document, `docs/01-backend-api-specification.md`, which outlines the backend API specification for the KROW Workforce platform. The document serves as a reference for developing the backend on Firebase Data Connect and Cloud Functions, replacing the Base44 backend. It covers authentication, the data API, the services API, security, migration notes, and implementation priority.

feat: remove obsolete documentation files

This commit removes several obsolete documentation files that are no longer relevant to the project:

- `docs/01-product-functional-roadmap.md`: the product roadmap, now outdated.
- `docs/02-architecture-overview.md`: an architecture overview, superseded by more recent documentation.
- `docs/03-backend-api-specification.md`: the backend API specification, replaced by `docs/01-backend-api-specification.md`.
- `docs/04-strategy-technical-roadmap.md`: the technical roadmap, now outdated.
- `docs/05-project-plan.md`: the project plan, no longer current.
- `docs/06-maintenance-guide.md`: a maintenance guide, no longer applicable.
- `docs/07-reference-base44-api-export-v3.md`: a reference to the Base44 API export, no longer needed.

chore: remove obsolete documentation files

This commit removes several documentation files that are no longer relevant or have been superseded by other documentation. Removing them keeps the repository clean and organized.

The following files were removed:

- `docs/07-reference-base44-api-export.md`
- `docs/08-reference-base44-prompts.md`
- `docs/09-sred-tracking.md`
- `docs/10-development-conventions.md`
- `docs/flows/vendor-flow.md`
- `docs/issues/template.md`
- `docs/prompts/create-codemagic-monorepo.md`
- `docs/prompts/create-full-architecture-diagram-flutter.md`
- `docs/prompts/create-mermaid-be-diagrams-flutter.md`
- `docs/prompts/create-mermaid-overview-flutter.md`
- `docs/prompts/create-mermaid-usecase-flutter.md`
@@ -1,7 +1,13 @@
-# KROW Workforce API Specification (GCP Migration) - Version 3.0
+# KROW Workforce API Specification (Legacy Reference)

-**Version:** 3.0
-**Date:** 2025-11-20
+> [!WARNING]
+> **Status: LEGACY / ARCHIVED REFERENCE**
+> This document is based on a historical export from the Base44 platform.
+> It is maintained in this repository solely for reference purposes during the rebuild and is **not** to be considered the definitive or active API specification for the production system.
+> The actual data schemas and operations are now defined directly within `backend/dataconnect/`.
+
+**Original Version:** 3.0
+**Original Date:** 2025-11-20
 **Objective:** This document defines the backend API to be built on the Firebase/GCP ecosystem, replacing the Base44 backend. It is based on the comprehensive Base44 API documentation (v3.0) and will guide the development of the new backend using Firebase Data Connect and Cloud Functions.

 ---
@@ -1,84 +0,0 @@
# Product Roadmap: KROW Platform Foundation (3-Month Plan)

**Project Name:** KROW - New Platform Foundation

**Context:** We are leveraging the validated visual prototype from Base44 to rebuild the KROW platform on a modern, scalable, and proprietary technical infrastructure. This ensures a smooth transition and provides clear visibility on delivered features.

**3-Month Goal:** To have a functional and autonomous first version of the KROW platform, including a web dashboard for event management and a mobile app for staff, all running on our new Google Cloud infrastructure.

---

### **Detailed Product Roadmap (Client/CEO View)**

#### **Phase 1: From Vision to Technical Foundation**

**Goal of the Phase (for the client):** "We will transform your prototype into a concrete action plan. By the end of this phase, we will have a first version of the dashboard deployed where you can track our progress, and we will be fully independent to build and deploy future applications."

**Visible Features Delivered at the End of Phase 1:**

1. **A Preview Web Dashboard (React):**
   * **Description:** A first version of the dashboard (based on the Base44 export) will be accessible via a private URL. Initially, the data will be static, but the visual interface will be live.
   * **What this means for the client:** You will see the design and navigation of your dashboard come to life outside of Base44, on our own infrastructure.

2. **A Foundation for the New Mobile Apps:**
   * **Description:** The skeletons of the new Flutter apps (Staff and Client) will be created, with a functional login system.
   * **What this means for the client:** The foundations for the future apps will be ready, and we will have validated our ability to deploy them automatically to the app stores (TestFlight for Apple, Internal Testing for Google).

3. **A Documented V1 API Contract:**
   * **Description:** A clear document that precisely describes how the frontend (web and mobile) will communicate with the backend. This is the "blueprint" of our system.
   * **What this means for the client:** This guarantees that all parts of the KROW ecosystem (web, mobile, backend) will speak the same language, ensuring consistency and speed for future developments.

**Technical Work Behind the Scenes (for the team):**

* Analysis of the Base44 code.
* Setup of the infrastructure on Google Cloud (Cloud SQL, Firebase Auth, Data Connect).
* Configuration of CI/CD pipelines for web (e.g., GitHub Actions) and mobile (e.g., CodeMagic).

---
#### **Phase 2: The First Features Come to Life**

**Goal of the Phase (for the client):** "You will start seeing your own data (events, staff) appear in the new dashboard. The staff mobile app will no longer be an empty shell but will display actual shifts. We are establishing a workflow that will allow you to continue prototyping in parallel with our development."

**Visible Features Delivered at the End of Phase 2:**

1. **An Event Management Dashboard (Read-only):**
   * **Description:** The web dashboard will display the list of your events, hubs, and staff, reading the data directly from our new database. Create/edit functionalities will not be active yet.
   * **What this means for the client:** You will be able to view your real operational data on the new interface, validating that our backend is correctly connected.

2. **The Staff Mobile App (v2) Displays Real Shifts:**
   * **Description:** A staff member will be able to log into the new app and see the list of shifts they are assigned to, with basic details (date, location, role).
   * **What this means for the client:** The synergy between the backend and the mobile app is proven. The foundation is ready to add interactive features.

3. **An Established Design-to-Development Iteration Workflow:**
   * **Description:** We will have a clear process to "freeze" a version of the Base44 design for development, while giving you the freedom to work on the next version.
   * **What this means for the client:** You can continue to innovate and prototype on Base44 without fear of disrupting the development team's work.

**Technical Work Behind the Scenes:**

* Development of Data Connect queries to read data (`listEvents`, `listStaff`, etc.).
* Integration of TanStack Query in the dashboard for data management.
* Implementation of the service layer in the Flutter app to call the Data Connect SDK.
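The Phase 2 wiring described above (Data Connect queries consumed through a thin client-side service layer) can be sketched in TypeScript. All names here (`DataConnectClient`, `listEvents`, the transport shape) are illustrative assumptions for this roadmap item, not the generated Data Connect SDK:

```typescript
// A thin service layer: sends a GraphQL document, unwraps the response
// envelope, and surfaces errors to the UI layer.
type GraphQLResponse = { data?: any; errors?: { message: string }[] };
type Transport = (query: string, variables: object) => Promise<GraphQLResponse>;

const LIST_EVENTS_QUERY = `
  query listEvents {
    events { id event_name status date }
  }
`;

class DataConnectClient {
  constructor(private transport: Transport) {}

  async listEvents(): Promise<{ id: string; event_name: string }[]> {
    const res = await this.transport(LIST_EVENTS_QUERY, {});
    if (res.errors?.length) throw new Error(res.errors[0].message);
    return res.data?.events ?? [];
  }
}

// A fake transport stands in for the real HTTP/SDK layer in this sketch:
const fakeTransport: Transport = async () => ({
  data: {
    events: [{ id: "e1", event_name: "Launch Party", status: "Active", date: "2025-11-20" }],
  },
});

const client = new DataConnectClient(fakeTransport);
```

Injecting the transport keeps the service layer testable before the real generated SDK is available.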
---

#### **Phase 3: The First Complete Business Flow**

**Goal of the Phase (for the client):** "By the end of this quarter, you will be able to perform a complete business action on the new platform: create a complex event on the web dashboard and instantly see the corresponding shifts appear on the staff mobile app."

**Visible Features Delivered at the End of Phase 3:**

1. **A Complete Event Creation and Modification Flow on the Web Dashboard:**
   * **Description:** The event creation form will be fully functional. You will be able to create an event, add shifts, define positions with roles and rates, and assign contacts.
   * **What this means for the client:** **You are now autonomous in managing the core of your operations on the new platform.** The Base44 prototype has been transformed into a functional production tool.

2. **Synchronization Between Web and Mobile:**
   * **Description:** A shift created or modified on the web dashboard will be immediately (or almost immediately) visible and updated on the mobile app of the concerned staff member.
   * **What this means for the client:** The vision of a unified and reactive ecosystem is now a tangible reality.

3. **A Stabilized Platform Ready for Growth:**
   * **Description:** The technical foundation will have been tested, documented, and secured.
   * **What this means for the client:** We have a solid foundation on which we can confidently build the more advanced features of your vision (KROW University, predictive AI, etc.).

**Technical Work Behind the Scenes:**

* Development of complex Data Connect mutations for create/edit logic.
* Implementation of unit tests and security reviews.
* Finalization of the V1 API documentation.
@@ -1,95 +0,0 @@
# KROW Project Workflows

This document contains diagrams describing the technical architecture and collaboration processes for the project.

## 1. Web App Migration Architecture

This diagram illustrates the migration workflow for the web application. It shows how the UI is exported from the Base44 environment and then connected to our new, unified backend built on Firebase services.

```mermaid
graph LR
    subgraph Base44 Environment
        direction TB
        Client[Client] -- Modifies --> B44_UI[<b>Base44 Visual Builder</b><br><i>Features:</i><br>- Event Management<br>- Staff Directory]
        B44_UI --> B44_Backend[<b>Base44 Backend</b><br>Provides Schemas & SDK]
    end

    subgraph Firebase Ecosystem - GCP
        direction TB
        KROW_FE[<b>KROW Web Frontend</b><br>Vite/React + TanStack Query]

        subgraph Firebase Services
            direction TB
            Auth[Firebase Authentication]
            DataConnect[<b>Firebase Data Connect</b><br>GraphQL API]
            SQL_DB[<b>Cloud SQL for PostgreSQL</b>]
        end

        KROW_FE -- "Uses" --> Auth
        KROW_FE -- "Calls API via SDK" --> DataConnect
        DataConnect -- "Manages & Queries" --> SQL_DB
    end

    B44_UI -- "<b>UI Code Export</b>" --> KROW_FE

    style Client fill:#f9f,stroke:#333,stroke-width:2px
    style B44_UI fill:#ffe,stroke:#333,stroke-width:2px
    style KROW_FE fill:#eef,stroke:#333,stroke-width:2px
```

## 2. Mobile App Architecture

This diagram shows how the native mobile applications (Client and Staff) connect to the centralized Firebase backend. This backend is the same one used by the web application.

```mermaid
graph TD
    subgraph KROW Mobile Applications
        direction LR
        Mobile_Client[<b>Mobile Client App</b><br>Flutter]
        Mobile_Staff[<b>Mobile Staff App</b><br>Flutter]
    end

    subgraph Firebase Backend Services - GCP
        direction TB
        Auth[Firebase Authentication]
        DataConnect[<b>Firebase Data Connect</b><br>GraphQL API &<br>Generated SDKs]
        SQL_DB[<b>Cloud SQL for PostgreSQL</b><br><i>Managed by Data Connect</i>]
    end

    Mobile_Client -- "Authenticates with" --> Auth
    Mobile_Client -- "Calls API via generated SDK" --> DataConnect

    Mobile_Staff -- "Authenticates with" --> Auth
    Mobile_Staff -- "Calls API via generated SDK" --> DataConnect

    DataConnect -- "Manages & Queries" --> SQL_DB

    style Mobile_Client fill:#eef,stroke:#333,stroke-width:2px
    style Mobile_Staff fill:#eef,stroke:#333,stroke-width:2px
```

## 3. Collaboration Workflow for Modifications

This diagram formalizes the process to follow for any modifications initiated by the client on the Base44 platform. The objective is to control the pace of changes and evaluate their impact on our backend before integration.

```mermaid
flowchart TD
    A[Client identifies a need<br>for modification] --> B{Define functionality<br>and scope};

    B --> C{Does the modification impact<br>only the UI or also<br>logic/data?};

    C -- "UI Only" --> D[Client makes modifications<br>on Base44];
    C -- "Logic/Data" --> E[Team-Client Coordination<br>to assess impact on GCP backend];

    D --> F[Planned export of the<br>new UI version];
    E --> F;

    F --> G["Developer runs the automation<br>1. `make integrate-export`<br>2. `make prepare-export`"];

    G --> H[Development & Testing<br>- Adapt GCP backend if needed<br>- Validate locally];

    H --> I[✅ Integration complete];

    style A fill:#f9f,stroke:#333,stroke-width:2px
    style D fill:#f9f,stroke:#333,stroke-width:2px
```
@@ -1,131 +0,0 @@
# KROW Workforce API Specification (GCP Migration)

**Version:** 2.0
**Date:** 2025-11-11
**Objective:** This document defines the backend API to be built on the Firebase/GCP ecosystem, replacing the Base44 backend. It is based on the comprehensive Base44 API documentation (v2.0) and will guide the development of the new backend using Firebase Data Connect and Cloud Functions.

---

## General Conventions

- **API Layer:** The backend will be composed of two main parts:
  - **Firebase Data Connect:** A GraphQL API for all data-centric (CRUD) operations.
  - **Cloud Functions:** A set of RESTful endpoints for all service-centric operations (e.g., sending emails, processing files).
- **Authentication:** Every request must include an `Authorization: Bearer <Firebase-Auth-Token>` header, managed and validated by Firebase.
- **Data Format:** All requests and responses will be in `application/json` format.
- **Error Responses:** Errors will use standard HTTP status codes (400, 401, 403, 404, 500) and include a JSON response body of the form `{ "error": "Problem description" }`.
- **Common Fields:** Each entity will have the following fields, automatically managed by the backend:
  - `id`: `string` (UUID, Primary Key)
  - `created_date`: `string` (ISO 8601 Timestamp)
  - `updated_date`: `string` (ISO 8601 Timestamp)
  - `created_by`: `string` (Email of the creating user)
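The conventions above can be captured as small client-side helpers. This is a minimal sketch; `buildHeaders` and `errorBody` are illustrative helper names, not part of the specification:

```typescript
// Every request carries a Firebase Auth token and a JSON content type.
function buildHeaders(firebaseAuthToken: string): Record<string, string> {
  return {
    Authorization: `Bearer ${firebaseAuthToken}`,
    "Content-Type": "application/json",
  };
}

// Errors use standard HTTP status codes plus a `{ "error": ... }` JSON body.
const ERROR_STATUSES: readonly number[] = [400, 401, 403, 404, 500];

function errorBody(description: string): { error: string } {
  return { error: description };
}

const headers = buildHeaders("example-token"); // token value is a placeholder
```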
---

## 1. Authentication (Firebase Auth)

Authentication will be handled entirely by Firebase Authentication. The client applications (web and mobile) are responsible for the sign-up and sign-in flows using the Firebase SDK. The backend will use the provided auth token to identify the user for all subsequent requests.

### `User` Entity (Managed by Firebase Auth & Data Connect)

| Field       | Type     | Description                                                   |
| ----------- | -------- | ------------------------------------------------------------- |
| `id`        | `string` | Firebase User ID (UID)                                        |
| `email`     | `string` | User's email (non-modifiable)                                 |
| `full_name` | `string` | Full name                                                     |
| `user_role` | `string` | Custom application role (`admin`, `procurement`, `client`...) |
| `...other`  | `any`    | Other custom fields can be added.                             |
---

## 2. Data API (Firebase Data Connect)

All entities below will be managed via a GraphQL API powered by Firebase Data Connect. For each entity, standard `query` (list, get by ID) and `mutation` (create, update, delete) operations will be defined in the `firebase/dataconnect/` directory.
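As a sketch of what those standard operations might look like for the `Event` entity: the operation names, argument types, and `event_insert` shape below are assumptions modeled on this specification, not the output of the actual Data Connect tooling:

```typescript
// Hypothetical get-by-ID query and create mutation, written as the GraphQL
// documents a client would send.
const GET_EVENT_QUERY = /* GraphQL */ `
  query getEvent($id: UUID!) {
    event(id: $id) {
      id
      event_name
      status
      date
    }
  }
`;

const CREATE_EVENT_MUTATION = /* GraphQL */ `
  mutation createEvent($event_name: String!, $date: Date) {
    event_insert(data: { event_name: $event_name, date: $date }) {
      id
    }
  }
`;

// Data-centric requests ultimately travel as a JSON payload of this shape:
function buildOperationPayload(query: string, variables: Record<string, unknown>) {
  return { query, variables };
}

const payload = buildOperationPayload(CREATE_EVENT_MUTATION, {
  event_name: "Launch Party",
  date: "2025-12-01",
});
```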
### 2.1. Event

**Description:** Manages events and workforce orders.

| Field             | Type      | Description                                                                    |
| ----------------- | --------- | ------------------------------------------------------------------------------ |
| `event_name`      | `string`  | Name of the event (required)                                                   |
| `is_recurring`    | `boolean` | Indicates if the event is recurring                                            |
| `recurrence_type` | `string`  | `single`, `date_range`, `scatter`                                              |
| `business_id`     | `string`  | ID of the client (`Business`)                                                  |
| `vendor_id`       | `string`  | ID of the provider (`Vendor`)                                                  |
| `status`          | `string`  | `Draft`, `Active`, `Pending`, `Assigned`, `Confirmed`, `Completed`, `Canceled` |
| `date`            | `string`  | Event date (ISO 8601)                                                          |
| `shifts`          | `jsonb`   | Array of `Shift` objects                                                       |
| `total`           | `number`  | Total cost                                                                     |
| `requested`       | `number`  | Total number of staff requested                                                |
| `assigned_staff`  | `jsonb`   | Array of assigned staff objects                                                |
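Client-side helpers over the `Event` fields above could look like this sketch. The `status` values come from the table; deriving `requested` by summing a per-shift headcount is an illustrative assumption, since the spec does not define the `Shift` shape here:

```typescript
// Valid values for the Event `status` field, per the table above.
const EVENT_STATUSES = [
  "Draft", "Active", "Pending", "Assigned", "Confirmed", "Completed", "Canceled",
] as const;
type EventStatus = (typeof EVENT_STATUSES)[number];

function isEventStatus(value: string): value is EventStatus {
  return (EVENT_STATUSES as readonly string[]).includes(value);
}

// `shifts` is a jsonb array; assume (hypothetically) each shift carries a
// requested headcount that rolls up into the event-level `requested` total.
interface Shift { position: string; headcount: number; }

function totalRequested(shifts: Shift[]): number {
  return shifts.reduce((sum, s) => sum + s.headcount, 0);
}

const requested = totalRequested([
  { position: "Bartender", headcount: 4 },
  { position: "Server", headcount: 6 },
]);
```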
### 2.2. Staff

**Description:** Manages staff members.

| Field                     | Type     | Description                               |
| ------------------------- | -------- | ----------------------------------------- |
| `employee_name`           | `string` | Full name (required)                      |
| `vendor_id`               | `string` | ID of the provider (`Vendor`)             |
| `email`                   | `string` | Email address                             |
| `position`                | `string` | Primary job position/skill                |
| `employment_type`         | `string` | `Full Time`, `Part Time`, `On call`, etc. |
| `rating`                  | `number` | Performance rating (0-5)                  |
| `reliability_score`       | `number` | Reliability score (0-100)                 |
| `background_check_status` | `string` | `pending`, `cleared`, `failed`, `expired` |
| `certifications`          | `jsonb`  | List of certifications                    |
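The bounded `Staff` fields above invite simple input normalization. A minimal sketch follows; clamping out-of-range scores (rather than rejecting them) is an illustrative choice, not specified behavior:

```typescript
// Enumerated values for `background_check_status`, per the table above.
const BACKGROUND_CHECK_STATUSES: readonly string[] = ["pending", "cleared", "failed", "expired"];

function clamp(value: number, min: number, max: number): number {
  return Math.min(max, Math.max(min, value));
}

// `rating` is bounded to 0-5 and `reliability_score` to 0-100.
function normalizeStaffScores(input: { rating: number; reliability_score: number }) {
  return {
    rating: clamp(input.rating, 0, 5),
    reliability_score: clamp(input.reliability_score, 0, 100),
  };
}

const normalized = normalizeStaffScores({ rating: 7, reliability_score: -3 });
```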
### 2.3. Vendor

**Description:** Manages providers and their onboarding.

| Field                   | Type      | Description                                      |
| ----------------------- | --------- | ------------------------------------------------ |
| `vendor_number`         | `string`  | Vendor number (e.g., `VN-####`)                  |
| `legal_name`            | `string`  | Legal company name (required)                    |
| `primary_contact_email` | `string`  | Primary contact email (required)                 |
| `approval_status`       | `string`  | `pending`, `approved`, `suspended`, `terminated` |
| `is_active`             | `boolean` | Active status                                    |
| `w9_document`           | `string`  | URL or URI of the W9 document                    |
| `coi_document`          | `string`  | URL or URI of the Certificate of Insurance       |
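A format check for the `vendor_number` pattern above might look like this. Treating `####` as exactly four digits is an assumption; the spec only shows the placeholder:

```typescript
// Matches the `VN-####` example from the Vendor table, assuming four digits.
const VENDOR_NUMBER_PATTERN = /^VN-\d{4}$/;

function isValidVendorNumber(value: string): boolean {
  return VENDOR_NUMBER_PATTERN.test(value);
}
```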
---

*Note: For brevity, only the most critical entities have been detailed. The same structure (schema defined in GraphQL) must be applied to all other entities: `VendorRate`, `Invoice`, `Business`, `Certification`, `Team`, `Conversation`, `Message`, `ActivityLog`, `Enterprise`, `Sector`, `Partner`, `Order`, and `Shift`, based on the `07-reference-base44-api-export.md` document.*

---

## 3. Services API (Cloud Functions)

These endpoints are not for CRUD operations but for specific, service-oriented tasks. They will be implemented as individual HTTP-triggered Cloud Functions.
### `POST /sendEmail`

- **Description:** Sends an email.
- **Original SDK:** `base44.integrations.Core.SendEmail(params)`
- **Body:** `{ "to": "...", "subject": "...", "body": "..." }`
- **Response (200 OK):** `{ "status": "sent" }`

### `POST /invokeLLM`

- **Description:** Calls a large language model (Vertex AI).
- **Original SDK:** `base44.integrations.Core.InvokeLLM(params)`
- **Body:** `{ "prompt": "...", "response_json_schema": {...}, "file_urls": [...] }`
- **Response (200 OK):** `{ "result": "..." }`

### `POST /uploadFile`

- **Description:** Handles the upload of public files to Google Cloud Storage and returns a public URL.
- **Original SDK:** `base44.integrations.Core.UploadFile({ file })`
- **Request:** `multipart/form-data`.
- **Response (200 OK):** `{ "file_url": "https://..." }`

### `POST /uploadPrivateFile`

- **Description:** Handles the upload of private files to Google Cloud Storage and returns a secure URI.
- **Original SDK:** `base44.integrations.Core.UploadPrivateFile({ file })`
- **Request:** `multipart/form-data`.
- **Response (200 OK):** `{ "file_uri": "gs://..." }`

### `POST /createSignedUrl`

- **Description:** Creates a temporary access URL for a private file.
- **Original SDK:** `base44.integrations.Core.CreateFileSignedUrl(params)`
- **Body:** `{ "file_uri": "...", "expires_in": 3600 }`
- **Response (200 OK):** `{ "signed_url": "https://..." }`
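How a caller might track expiry for `/createSignedUrl` responses can be sketched as follows. The request/response field names come from the spec above; computing the expiry timestamp client-side from `expires_in` is an illustrative assumption:

```typescript
// Request/response shapes from the `/createSignedUrl` spec.
interface SignedUrlRequest { file_uri: string; expires_in: number; } // seconds
interface SignedUrlResponse { signed_url: string; }

// Derive the wall-clock instant at which the signed URL stops working.
function expiresAt(request: SignedUrlRequest, nowMs: number): Date {
  return new Date(nowMs + request.expires_in * 1000);
}

// "gs://bucket/object" is a placeholder URI for illustration.
const req: SignedUrlRequest = { file_uri: "gs://bucket/object", expires_in: 3600 };
const expiry = expiresAt(req, Date.parse("2025-11-20T12:00:00Z"));
```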
@@ -1,41 +0,0 @@
# KROW Technical Roadmap

This document outlines the technical strategy for building the new, autonomous KROW platform. It is structured in phases rather than fixed dates to maintain agility.

```mermaid
gantt
    title KROW Platform Build Roadmap
    dateFormat W
    axisFormat Week %W

    section Phase 1: Foundation & Dev Environment Setup
    Infrastructure Setup : 1, 1w
    GraphQL Schema Definition : 1, 1w
    Data Connect Deployment (Dev): 2, 1w
    SDK Generation & Web/Mobile PoC : 3, 1w

    section Phase 2: Core Feature Implementation
    Backend Logic (All Entities): 4, 4w
    Web App Re-wiring : 4, 4w
    Mobile Apps Re-wiring : 5, 4w

    section Phase 3: Production Readiness & Go-Live
    CI/CD Pipelines Setup : 9, 2w
    Staging Env Deployment & E2E Testing : 10, 2w
    Production Deployment & Data Import : 12, 1w
    Monitoring & Security : 12, 1w
```

---

## Phase 1: Foundation & Dev Environment Setup (~3-4 Weeks)

* **Goal:** To have a fully functional, shared `dev` environment in the cloud. All developers can connect to it from their local machines.
* **Key Milestone:** The web app and a mobile app screen can successfully authenticate and fetch live data from the `dev` Firebase/GCP project.

## Phase 2: Core Feature Implementation (~5-6 Weeks)

* **Goal:** To achieve functional parity with the Base44 prototype across all three platforms, all powered by our shared `dev` backend.
* **Key Milestone:** The full lifecycle of core features (Event, Staff, Vendor management) is functional on all apps.

## Phase 3: Production Readiness & Go-Live (~4 Weeks)

* **Goal:** To automate, secure, and deploy the entire platform to production.
* **Key Milestone:** The KROW platform is live on production infrastructure. The team has a repeatable, automated process for future deployments.
@@ -1,95 +0,0 @@
# KROW Project Plan & Task Breakdown

This document breaks down the technical roadmap into actionable tasks, assigned by role, ready to be converted into GitHub Issues.

---

## Milestone 1: Foundation & Dev Environment Setup

*Goal: Establish a fully functional, shared `dev` environment on GCP/Firebase and validate that all core components (Web, Mobile, Backend) can be built, deployed, and connected.*

### Infrastructure & Tooling (Primarily CTO)

- **Issue:** `[Infra] Setup Enpass for Team Credential Management`
  - **Description:** Configure the team's Enpass vault and establish the process for sharing secrets and service account keys.
- **Issue:** `[Infra] Create GCP/Firebase Projects (dev, staging, prod)`
  - **Description:** Set up the three distinct Google Cloud projects and associated Firebase projects. Enable required APIs (Auth, Cloud SQL, Data Connect).
- **Issue:** `[Infra] Create Multi-Env Makefile`
  - **Description:** Create the main `Makefile` to handle environment switching (`ENV=dev/staging`) and orchestrate all build/deploy tasks.
- **Issue:** `[Infra] Setup Shared Dev Database`
  - **Description:** Provision the initial Cloud SQL for PostgreSQL instance for the `dev` environment.

### Backend & Web (Dev 1)

- **Epic:** `[Onboarding] End-to-End Flow Validation with 'Event' Entity`
  - **Issue:** `[Backend] Define and Deploy 'Event' Schema`
    - **Description:** Translate the `Event` schema from the API specification into `.gql` files. Define the basic `listEvents` query and `createEvent` mutation. Use the `Makefile` to deploy this to the `dev` environment and validate that the `events` table is created in Cloud SQL.
  - **Issue:** `[Web] Generate TypeScript SDK for Dev Env`
    - **Description:** Configure and run the SDK generation command to create the TypeScript SDK pointing to the `dev` environment.
  - **Issue:** `[Web] Connect 'Events' Page to Dev Backend (PoC)`
    - **Description:** Modify the main web application's `Events.jsx` page. Replace the existing mock/Base44 data fetching with the new TanStack Query hooks from the generated SDK to display a list of events from our own `dev` backend. This validates the full end-to-end workflow on a real feature.

- **Epic:** `[Backend] KROW Schema Implementation`
  - **Issue:** `[Backend] Define GraphQL Schema for Remaining Core Entities`
    - **Description:** Translate `Staff`, `Vendor`, `User`, and other core schemas from the API specification into `.gql` files and deploy them.

### Mobile (Dev 2)

- **Epic:** `[Mobile] Analysis & Documentation`
  - **Issue:** `[Mobile-Doc] Analyze & Document Existing App Logic`
    - **Description:** Review the legacy Flutter codebases to identify and document key business logic and user flows.
  - **Issue:** `[Mobile-Doc] Create & Update Workflow Diagrams`
    - **Description:** Based on the analysis, create or update Mermaid diagrams for critical workflows and add them to the internal launchpad.

- **Epic:** `[Mobile] CI/CD & Skeleton App Setup`
  - **Issue:** `[Mobile-CI/CD] Configure CodeMagic & Firebase App Distribution`
    - **Description:** Set up CodeMagic and configure build workflows for iOS/Android with automated deployment to Firebase App Distribution.
  - **Issue:** `[Mobile-CI/CD] Initialize Skeleton Apps in Monorepo`
    - **Description:** Create new, clean Flutter projects for `client-app` and `staff-app` within the `mobile-apps` directory.
  - **Issue:** `[Mobile-CI/CD] Implement Initial CI/CD Pipeline`
    - **Description:** Create a "Hello World" version of the Staff app and validate that it can be automatically built and deployed to App Distribution.

- **Epic:** `[Mobile] Backend Integration Validation`
  - **Issue:** `[Mobile-Auth] Implement Firebase Auth Flow in Skeleton App`
    - **Description:** Add Firebase Authentication to the skeleton Staff app and ensure users can sign up/log in against the `dev` project.
  - **Issue:** `[Mobile-Backend] Generate Flutter SDK for Dev Env`
    - **Description:** Configure and run the SDK generation command to create the Flutter SDK for the `dev` environment.
  - **Issue:** `[Mobile-Backend] Create Proof-of-Concept Screen`
    - **Description:** Build a simple screen in the skeleton Staff app that, after login, fetches and displays a list of events from the `dev` backend using the new SDK.

---

## Milestone 2: Core Feature Implementation

*Goal: Achieve functional parity with the Base44 prototype across all platforms, using the shared `dev` backend.*

### Backend (Dev 1)

- **Epic:** `[Backend] Implement Full API Logic`
  - **Description:** Create all necessary GraphQL queries and mutations in Data Connect for all entities. Deploy them continuously to the `dev` environment.

### Web (Dev 1, with support from Dev 2)

- **Epic:** `[Web] Full Application Re-wiring`
  - **Description:** Systematically replace all data-fetching logic in the web app to use the TanStack Query hooks from the generated Data Connect SDK.

### Mobile (Dev 2)

- **Epic:** `[Mobile] Port Features to New Apps`
  - **Description:** Systematically port the features and UI from the legacy apps into the new, clean skeleton apps, connecting them to the Data Connect backend via the generated SDK.

---

## Milestone 3: Production Readiness & Go-Live

*Goal: Automate, secure, and deploy the entire platform to production.*

### Infrastructure & DevOps (CTO & Team)

- **Issue:** `[CI/CD] Configure Web App Deployment Pipeline`
  - **Description:** Set up a GitHub Actions pipeline to build and deploy the web app to Firebase Hosting (`staging` and `prod`).
- **Issue:** `[CI/CD] Finalize Production Mobile Deployment`
  - **Description:** Finalize the CodeMagic pipelines for deployment to TestFlight/Play Store production tracks.
- **Issue:** `[CI/CD] Configure Backend Deployment Pipeline`
  - **Description:** Automate the deployment of the Data Connect schema and operations.
- **Issue:** `[Data] Create & Test Initial Data Import Scripts`
  - **Description:** Write scripts to populate the production database with any necessary initial data.
- **Issue:** `[QA] Deploy to Staging & Perform E2E Testing`
  - **Description:** Use the `Makefile` (`make deploy ENV=staging`) to deploy the entire stack to the staging environment for full end-to-end testing.
- **Issue:** `[Ops] Final Production Deployment`
  - **Description:** Run the production deployment (`make deploy ENV=prod`) and execute data import scripts.
- **Issue:** `[Ops] Setup Monitoring & Alerting`
  - **Description:** Configure monitoring dashboards in Google Cloud for the database, API, and application performance.
@@ -1,55 +0,0 @@
|
|||||||
# API Documentation Maintenance Guide

This document describes the procedure for updating the API documentation and our backend's technical specification after major changes are made on the Base44 platform.

Following this process is **essential** to ensure that our custom backend on GCP remains synchronized with the frontend's features.

## When to Follow This Procedure

You should follow this guide after each significant development cycle on the Base44 platform, especially after:
- Adding new entities or data fields.
- Modifying existing business logic.
- Integrating major new features into the user interface.

---

## Update Procedure

### Step 1: Obtain Updated Documentation from Base44

1. **Open the `docs/08-reference-base44-prompts.md` file**.
2. Copy the content of the **"Main Prompt"**.
3. Paste this prompt into the Base44 AI chat to request the latest documentation.
4. **Verification:** The AI should return the full content of the `base44-api-export.md` file. If it only returns a summary, use the following simple prompt to request the full content:

   ```text
   Thank you for the summary. Please provide the entire, updated content of the API documentation file now.
   ```

### Step 2: Update the Local Documentation File (with Gemini CLI)

To ensure clean and consistent formatting, it is recommended to use the Gemini CLI for this step.

1. **Copy the raw content** provided by the Base44 AI.
2. **Provide this content to the Gemini CLI** with a simple prompt, for example:
   > "Here is the new Base44 API documentation. Can you reformat this content and update the `docs/07-reference-base44-api-export.md` file?"
3. **Let the Gemini CLI** handle the file creation or update. It will ensure that tables, code blocks, and headers are correctly formatted.

### Step 3: Update the GCP API Specification (with Gemini CLI)

This is the most critical step. Instead of a tedious manual comparison, we rely on the AI to synchronize our migration plan.

1. **Ensure Step 2 is complete** and that `docs/07-reference-base44-api-export.md` is up to date.
2. **Ask the Gemini CLI** to update the specification for you. Use a clear prompt, for example:
   > "Now that `docs/07-reference-base44-api-export.md` is updated, can you analyze the changes and comprehensively update the `docs/03-backend-api-specification.md` file to match?"
3. **Let the Gemini CLI** perform the comparative analysis and apply the necessary changes (adding fields, entities, integrations, etc.) to the specification file.

### Step 4: Validate and Commit the Changes

1. Give the changes in `docs/03-backend-api-specification.md` a final review to ensure they are consistent.
2. Commit the updated files to Git with a clear and descriptive message.

   ```bash
   git add docs/
   git commit -m "docs: Update API documentation and specification from Base44 export"
   ```

---
# KROW Workforce Platform - API Documentation

**Version:** 3.0 (Auto-Updated)
**Last Updated:** 2025-11-20
**Project:** KROW Workforce Control Tower

---

## Table of Contents
1. [Overview](#overview)
2. [Authentication](#authentication)
3. [Entity Schemas](#entity-schemas)
4. [SDK Operations](#sdk-operations)
5. [Core Integrations](#core-integrations)
6. [Data Models Reference](#data-models-reference)
7. [Code Examples](#code-examples)
8. [Best Practices](#best-practices)
9. [Security Considerations](#security-considerations)
10. [Rate Limits & Quotas](#rate-limits--quotas)
11. [Changelog](#changelog)
12. [Support & Resources](#support--resources)

---

## 1. Overview
KROW Workforce is a comprehensive workforce management platform built on Base44. This documentation provides complete API specifications for all entities, SDK methods, and integration endpoints.

### Base44 Client Import
```javascript
import { base44 } from "@/api/base44Client";
```

---

## 2. Authentication

### User Authentication Methods

```javascript
// Get the current authenticated user
const user = await base44.auth.me();

// Update the current user
await base44.auth.updateMe({
  full_name: "John Doe",
  custom_field: "value"
});

// Logout
base44.auth.logout(redirectUrl?: string);

// Redirect to login
base44.auth.redirectToLogin(nextUrl?: string);

// Check authentication status
const isAuthenticated = await base44.auth.isAuthenticated();
```

### User Object Structure
```json
{
  "id": "string",
  "email": "string",
  "full_name": "string",
  "role": "admin | user",
  "user_role": "string",
  "created_date": "timestamp",
  "updated_date": "timestamp"
}
```
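
The distinction between the built-in `role` and the application-level `user_role` is easy to get wrong in permission checks. A minimal sketch over a plain user object shaped like the structure above (the helper name is illustrative, not part of the Base44 SDK):

```javascript
// Illustrative helper, not part of the Base44 SDK: decides whether a user
// may manage other users. Only the built-in `role` grants admin rights;
// `user_role` is an app-specific label and must not be used for this check.
function canManageUsers(user) {
  return user.role === "admin";
}

const admin = { id: "u1", email: "a@example.com", full_name: "A", role: "admin", user_role: "procurement" };
const regular = { id: "u2", email: "b@example.com", full_name: "B", role: "user", user_role: "operator" };

console.log(canManageUsers(admin));   // true
console.log(canManageUsers(regular)); // false
```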

---

## 3. Entity Schemas

### 1. User Entity (Built-in)
**Description:** Core user entity with authentication and role management.

| Field Name | Type | Description | Validation |
| :--- | :--- | :--- | :--- |
| `id` | string | Unique user identifier | Auto-generated, unique |
| `email` | string | User email address | Required, unique, email format |
| `full_name` | string | User's full name | Required |
| `role` | string | Base role | Enum: "admin", "user" |
| `user_role` | string | Custom application role | Optional, custom values |
| `created_date` | timestamp | Account creation date | Auto-generated |
| `updated_date` | timestamp | Last update timestamp | Auto-updated |

**Security Rules:**
* Only admin users can list, update, or delete other users.
* Regular users can only view and update their own user record.

### 2. Event Entity
**Description:** Core event/order management entity for workforce scheduling.

| Field Name | Type | Description | Validation |
| :--- | :--- | :--- | :--- |
| `id` | string | Unique event identifier | Auto-generated |
| `event_name` | string | Name of the event | Required |
| `is_rapid` | boolean | RAPID/urgent order flag | Default: false |
| `is_recurring` | boolean | Whether event recurs | Default: false |
| `is_multi_day` | boolean | Multi-day event flag | Default: false |
| `recurrence_type` | string | Type of recurrence | Enum: "single", "date_range", "scatter" |
| `recurrence_start_date` | date | Start date for recurring events | Optional |
| `recurrence_end_date` | date | End date for recurring events | Optional |
| `scatter_dates` | array | Specific dates for scatter recurring | Array of date strings |
| `multi_day_start_date` | date | Multi-day start date | Optional |
| `multi_day_end_date` | date | Multi-day end date | Optional |
| `buffer_time_before` | number | Buffer time before (minutes) | Default: 0 |
| `buffer_time_after` | number | Buffer time after (minutes) | Default: 0 |
| `conflict_detection_enabled` | boolean | Enable conflict detection | Default: true |
| `detected_conflicts` | array | Array of detected conflicts | Array of conflict objects |
| `business_id` | string | Associated business ID | Optional |
| `business_name` | string | Business name | Optional |
| `vendor_id` | string | Vendor ID if created by vendor | Optional |
| `vendor_name` | string | Vendor name | Optional |
| `hub` | string | Hub location | Optional |
| `event_location` | string | Event location address | Optional |
| `contract_type` | string | Contract type | Enum: "W2", "1099", "Temp", "Contract" |
| `po_reference` | string | Purchase order reference | Optional |
| `status` | string | Event status | Enum: "Draft", "Active", "Pending", "Assigned", "Confirmed", "Completed", "Canceled" |
| `date` | date | Event date | Optional |
| `shifts` | array | Array of shift objects | Optional |
| `addons` | object | Additional services/features | Optional |
| `total` | number | Total cost | Optional |
| `client_name` | string | Client contact name | Optional |
| `client_email` | string | Client email | Optional |
| `client_phone` | string | Client phone | Optional |
| `invoice_id` | string | Associated invoice ID | Optional |
| `notes` | string | Additional notes | Optional |
| `requested` | number | Total staff requested | Default: 0 |
| `assigned_staff` | array | Array of assigned staff | Optional |

**Detected Conflicts Structure:**
```json
{
  "conflict_type": "staff_overlap" | "venue_overlap" | "time_buffer",
  "severity": "low" | "medium" | "high" | "critical",
  "description": "string",
  "conflicting_event_id": "string",
  "staff_id": "string",
  "detected_at": "timestamp"
}
```
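
Clients typically surface the most severe conflicts first. A small illustrative sketch over plain objects shaped like the structure above (the ordering and helper are ours, not part of the API):

```javascript
// Illustrative only: order detected conflicts so critical ones come first.
// The severity ranking below is an assumption based on the enum order above.
const SEVERITY_ORDER = { critical: 0, high: 1, medium: 2, low: 3 };

function sortConflictsBySeverity(conflicts) {
  // Copy before sorting so the original array is left untouched.
  return [...conflicts].sort(
    (a, b) => SEVERITY_ORDER[a.severity] - SEVERITY_ORDER[b.severity]
  );
}

const conflicts = [
  { conflict_type: "time_buffer", severity: "low", description: "Tight turnaround" },
  { conflict_type: "staff_overlap", severity: "critical", description: "Double-booked staff" },
  { conflict_type: "venue_overlap", severity: "medium", description: "Venue clash" }
];

console.log(sortConflictsBySeverity(conflicts).map(c => c.severity));
// → ["critical", "medium", "low"]
```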

### 3. Staff Entity
**Description:** Employee/workforce member management.

| Field Name | Type | Description | Validation |
| :--- | :--- | :--- | :--- |
| `id` | string | Unique staff identifier | Auto-generated |
| `employee_name` | string | Full name | Required |
| `vendor_id` | string | Associated vendor ID | Optional |
| `vendor_name` | string | Vendor company name | Optional |
| `manager` | string | Manager's name | Optional |
| `contact_number` | string | Primary contact number | Optional |
| `email` | string | Email address | Email format |
| `department` | string | Department | Enum: "Operations", "Sales", "HR", "Finance", "IT", "Marketing", "Customer Service", "Logistics" |
| `hub_location` | string | Hub/office location | Optional |
| `track` | string | Track information | Optional |
| `position` | string | Primary job position/skill | Optional |
| `profile_type` | string | Skill profile level | Enum: "Skilled", "Beginner", "Cross-Trained" |
| `employment_type` | string | Employment type | Enum: "Full Time", "Part Time", "On call", "Weekends", "Specific Days", "Seasonal", "Medical Leave" |
| `english` | string | English proficiency | Enum: "Fluent", "Intermediate", "Basic", "None" |
| `rating` | number | Staff performance rating (0-5 stars) | Min: 0, Max: 5 |
| `reliability_score` | number | Overall reliability score (0-100) | Min: 0, Max: 100 |
| `background_check_status` | string | Background check status | Enum: "pending", "cleared", "failed", "expired", "not_required" |

### 4. Vendor Entity
**Description:** Vendor/supplier management and onboarding.

| Field Name | Type | Description | Validation |
| :--- | :--- | :--- | :--- |
| `id` | string | Unique vendor identifier | Auto-generated |
| `vendor_number` | string | Vendor Number (VN-####) | Required, Pattern: `^VN-[0-9]{4}$` |
| `legal_name` | string | Legal business name | Required |
| `region` | string | Geographic region | Enum: "National", "Bay Area", "Southern California", "Northern California", "West", "East", "Midwest", "South" |
| `platform_type` | string | Technology integration level | Enum: "Full Platform", "Building platform (KROW)", "Partial Tech", "Traditional" |
| `primary_contact_email` | string | Primary email | Required, Email format |
| `approval_status` | string | Vendor approval status | Enum: "pending", "approved", "suspended", "terminated" |
| `is_active` | boolean | Is vendor currently active | Default: true |
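
The `vendor_number` pattern above can be checked client-side before submitting a create or update. A minimal sketch (the helper is illustrative; the backend remains the source of truth for validation):

```javascript
// Illustrative client-side check of the `vendor_number` pattern from the
// Vendor entity table: "VN-" followed by exactly four digits.
const VENDOR_NUMBER_PATTERN = /^VN-[0-9]{4}$/;

function isValidVendorNumber(value) {
  return VENDOR_NUMBER_PATTERN.test(value);
}

console.log(isValidVendorNumber("VN-0042")); // true
console.log(isValidVendorNumber("VN-42"));   // false (needs four digits)
console.log(isValidVendorNumber("vn-0042")); // false (case-sensitive)
```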

### 5. VendorRate Entity
**Description:** Vendor pricing and rate management.

| Field Name | Type | Description | Validation |
| :--- | :--- | :--- | :--- |
| `id` | string | Unique rate identifier | Auto-generated |
| `vendor_name` | string | Vendor name | Required |
| `category` | string | Service category | Enum: "Kitchen and Culinary", "Concessions", "Facilities", "Bartending", "Security", "Event Staff", "Management", "Technical", "Other" |
| `role_name` | string | Role/position name | Required |
| `employee_wage` | number | Employee base wage/hour | Required, Min: 0 |
| `markup_percentage` | number | Markup percentage | Min: 0, Max: 100 |
| `vendor_fee_percentage` | number | Vendor fee percentage | Min: 0, Max: 100 |
| `client_rate` | number | Final rate to client | Required, Min: 0 |
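
The table does not state how `client_rate` is derived from the wage and the two percentages. One plausible reading, applying both percentages on top of the wage, can be sketched as follows — this formula is an **assumption** and must be verified against the real backend logic:

```javascript
// ASSUMPTION: the spec does not define the relation between `employee_wage`,
// `markup_percentage`, `vendor_fee_percentage`, and `client_rate`. This
// sketch compounds both percentages on top of the wage; confirm the actual
// formula before relying on it.
function estimateClientRate(employeeWage, markupPercentage, vendorFeePercentage) {
  const withMarkup = employeeWage * (1 + markupPercentage / 100);
  return withMarkup * (1 + vendorFeePercentage / 100);
}

console.log(estimateClientRate(100, 50, 20)); // ≈ 180 ($100 wage, 50% markup, 20% fee)
```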

### 6. VendorDefaultSettings Entity
**Description:** Default markup and fee settings for vendors.

| Field Name | Type | Description | Validation |
| :--- | :--- | :--- | :--- |
| `id` | string | Unique settings identifier | Auto-generated |
| `vendor_name` | string | Name of the vendor | Required |
| `default_markup_percentage` | number | Default markup percentage | Required, Min: 0, Max: 100 |
| `default_vendor_fee_percentage` | number | Default vendor fee percentage | Required, Min: 0, Max: 100 |

### 7. Invoice Entity
**Description:** Invoice and billing management.

| Field Name | Type | Description | Validation |
| :--- | :--- | :--- | :--- |
| `id` | string | Unique invoice identifier | Auto-generated |
| `invoice_number` | string | Unique invoice number | Required |
| `amount` | number | Grand total invoice amount | Required, Min: 0 |
| `status` | string | Current invoice status | Enum: "Draft", "Pending Review", "Approved", "Disputed", "Under Review", "Resolved", "Overdue", "Paid", "Reconciled", "Cancelled" |
| `issue_date` | date | Invoice issue date | Required |
| `due_date` | date | Payment due date | Required |
| `disputed_items` | array | List of disputed staff entry indices | Optional |
| `is_auto_generated` | boolean | Whether invoice was auto-generated | Default: false |
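
A common client-side need is flagging invoices that are past due but not yet paid. A minimal sketch over plain invoice objects — note that which statuses count as "awaiting payment" is **not** defined in the spec, so the status list here is an assumption to adjust:

```javascript
// Illustrative helper: flag invoices past their due date that still await
// payment. ASSUMPTION: the set of "awaiting payment" statuses below is a
// guess; the spec does not define the status lifecycle.
const AWAITING_PAYMENT = new Set(["Approved", "Overdue", "Pending Review"]);

function isPastDue(invoice, today) {
  // ISO 8601 date strings (YYYY-MM-DD) compare correctly lexicographically.
  return AWAITING_PAYMENT.has(invoice.status) && invoice.due_date < today;
}

const invoice = { invoice_number: "INV-001", status: "Approved", due_date: "2025-01-31" };
console.log(isPastDue(invoice, "2025-02-15")); // true
console.log(isPastDue({ status: "Paid", due_date: "2025-01-31" }, "2025-02-15")); // false
```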

### 8. Business Entity
**Description:** Client business/company management.

| Field Name | Type | Description | Validation |
| :--- | :--- | :--- | :--- |
| `id` | string | Unique business identifier | Auto-generated |
| `business_name` | string | Business/client company name | Required |
| `contact_name` | string | Primary contact person | Required |
| `email` | string | Business email | Email format |
| `sector` | string | Sector/industry | Enum: "Bon Appétit", "Eurest", "Aramark", "Epicurean Group", "Chartwells", "Other" |
| `rate_group` | string | Pricing tier | Required, Enum: "Standard", "Premium", "Enterprise", "Custom" |
| `status` | string | Business status | Enum: "Active", "Inactive", "Pending" |

### 9. Certification Entity
**Description:** Employee certification and compliance tracking.

| Field Name | Type | Description | Validation |
| :--- | :--- | :--- | :--- |
| `id` | string | Unique certification ID | Auto-generated |
| `employee_name` | string | Staff member name | Required |
| `certification_name` | string | Certification name | Required |
| `certification_type` | string | Type of certification | Enum: "Legal", "Operational", "Safety", "Training", "License", "Other" |
| `status` | string | Current status | Enum: "current", "expiring_soon", "expired", "pending_validation" |
| `expiry_date` | date | Expiration date | Required |
| `validation_status` | string | Validation status | Enum: "approved", "pending_expert_review", "rejected", "ai_verified", "ai_flagged", "manual_review_needed" |
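
The `status` enum suggests it is derived from `expiry_date`. A minimal sketch of that derivation — the 30-day "expiring soon" window is an **assumption**; the spec does not state the actual threshold:

```javascript
// Illustrative derivation of the certification `status` value from
// `expiry_date`. ASSUMPTION: the 30-day "expiring_soon" window is ours,
// not taken from the spec.
function deriveCertificationStatus(expiryDate, today) {
  const msPerDay = 24 * 60 * 60 * 1000;
  const daysLeft = (new Date(expiryDate) - new Date(today)) / msPerDay;
  if (daysLeft < 0) return "expired";
  if (daysLeft <= 30) return "expiring_soon";
  return "current";
}

console.log(deriveCertificationStatus("2025-01-10", "2025-06-01")); // "expired"
console.log(deriveCertificationStatus("2025-06-20", "2025-06-01")); // "expiring_soon"
console.log(deriveCertificationStatus("2026-01-01", "2025-06-01")); // "current"
```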

### 10. Team Entity
**Description:** Team and organization management with role-based isolation.

| Field Name | Type | Description | Validation |
| :--- | :--- | :--- | :--- |
| `id` | string | Unique team identifier | Auto-generated |
| `team_name` | string | Name of the team | Required |
| `owner_id` | string | Team owner user ID | Required |
| `owner_name` | string | Team owner name | Required |
| `owner_role` | string | Role of team owner | Required, Enum: "admin", "procurement", "operator", "sector", "client", "vendor", "workforce" |
| `favorite_staff` | array | Array of favorite staff | Array of objects |
| `blocked_staff` | array | Array of blocked staff | Array of objects |

### 11. TeamMember Entity
**Description:** Team member management within teams.

| Field Name | Type | Description | Validation |
| :--- | :--- | :--- | :--- |
| `id` | string | Unique member identifier | Auto-generated |
| `team_id` | string | ID of the team this member belongs to | Required |
| `member_name` | string | Name of the team member | Required |
| `email` | string | Member email | Required, Email format |
| `role` | string | Role in the team | Enum: "admin", "manager", "member", "viewer" |
| `is_active` | boolean | Whether the member is active | Default: true |

### 12. TeamHub Entity
**Description:** Team hub/location management.

| Field Name | Type | Description | Validation |
| :--- | :--- | :--- | :--- |
| `id` | string | Unique hub identifier | Auto-generated |
| `team_id` | string | ID of the team | Required |
| `hub_name` | string | Name of the hub/location | Required |
| `departments` | array | Departments within this hub | Array of objects with `department_name`, `cost_center` |

### 13. TeamMemberInvite Entity
**Description:** Team member invitation management.

| Field Name | Type | Description | Validation |
| :--- | :--- | :--- | :--- |
| `id` | string | Unique invite identifier | Auto-generated |
| `team_id` | string | Team ID | Required |
| `invite_code` | string | Unique invite code | Required |
| `email` | string | Invitee email | Required, Email format |
| `invite_status` | string | Invite status | Enum: "pending", "accepted", "expired", "cancelled" |

### 14. Conversation Entity
**Description:** Messaging and communication management.

| Field Name | Type | Description | Validation |
| :--- | :--- | :--- | :--- |
| `id` | string | Unique conversation ID | Auto-generated |
| `participants` | array | Array of participant IDs/emails | Required |
| `conversation_type` | string | Type of conversation | Required, Enum: "client-vendor", "staff-client", "staff-admin", "vendor-admin", "client-admin", "group-staff", "group-event-staff" |
| `related_to` | string | ID of related entity | Optional |
| `status` | string | Conversation status | Enum: "active", "archived", "closed" |

### 15. Message Entity
**Description:** Individual messages within conversations.

| Field Name | Type | Description | Validation |
| :--- | :--- | :--- | :--- |
| `id` | string | Unique message identifier | Auto-generated |
| `conversation_id` | string | ID of the conversation | Required |
| `sender_name` | string | Name of the sender | Required |
| `content` | string | Message content | Required |
| `read_by` | array | Array of user IDs who have read the message | Array of strings |

### 16. ActivityLog Entity
**Description:** Activity and notification tracking.

| Field Name | Type | Description | Validation |
| :--- | :--- | :--- | :--- |
| `id` | string | Unique activity identifier | Auto-generated |
| `title` | string | Notification title | Required |
| `description` | string | Detailed description | Required |
| `activity_type` | string | Type of activity | Required, Enum: "event_created", "event_updated", "staff_assigned", "invoice_paid", etc. |
| `user_id` | string | ID of the user this notification is for | Required |
| `is_read` | boolean | Whether the notification has been read | Default: false |
### 17. Enterprise Entity
**Description:** Enterprise organization management.

| Field Name | Type | Description | Validation |
| :--- | :--- | :--- | :--- |
| `id` | string | Unique enterprise ID | Auto-generated |
| `enterprise_number` | string | Enterprise Number (EN-####) | Required, Pattern: `^EN-[0-9]{4}$` |
| `enterprise_name` | string | Enterprise name (e.g., Compass) | Required |
| `enterprise_code` | string | Short code identifier | Required |

### 18. Sector Entity
**Description:** Sector/branch management.

| Field Name | Type | Description | Validation |
| :--- | :--- | :--- | :--- |
| `id` | string | Unique sector identifier | Auto-generated |
| `sector_number` | string | Sector Number (SN-####) | Required, Pattern: `^SN-[0-9]{4}$` |
| `sector_name` | string | Sector/brand name (e.g., Bon Appétit) | Required |
| `sector_type` | string | Sector business type | Enum: "Food Service", "Facilities", "Healthcare", "Education", "Corporate", "Sports & Entertainment" |

### 19. Partner Entity
**Description:** Partner/client organization management.

| Field Name | Type | Description | Validation |
| :--- | :--- | :--- | :--- |
| `id` | string | Unique partner identifier | Auto-generated |
| `partner_name` | string | Partner/client name | Required |
| `partner_number` | string | Partner Number (PN-####) | Required, Pattern: `^PN-[0-9]{4}$` |
| `partner_type` | string | Partner type | Enum: "Corporate", "Education", "Healthcare", "Sports & Entertainment", "Government" |

### 20. Order Entity
**Description:** Order management system.

| Field Name | Type | Description | Validation |
| :--- | :--- | :--- | :--- |
| `id` | string | Unique order identifier | Auto-generated |
| `order_number` | string | Order Number (ORD-####) | Required, Pattern: `^ORD-[0-9]{4,6}$` |
| `partner_id` | string | Partner/Client ID | Required |
| `order_type` | string | Type of order | Enum: "Standard", "Last Minute", "Emergency", "Recurring" |
| `order_status` | string | Order status | Enum: "Draft", "Submitted", "Confirmed", "In Progress", "Completed", "Cancelled" |

### 21. Shift Entity
**Description:** Shift scheduling and management.

| Field Name | Type | Description | Validation |
| :--- | :--- | :--- | :--- |
| `id` | string | Unique shift identifier | Auto-generated |
| `shift_name` | string | Name of the shift | Required |
| `start_date` | timestamp | Shift start date/time | Required |
| `end_date` | timestamp | Shift end date/time | Optional |
| `assigned_staff` | array | List of assigned staff | Array of objects |

### 22. Assignment Entity
**Description:** Worker assignment and tracking.

| Field Name | Type | Description | Validation |
| :--- | :--- | :--- | :--- |
| `id` | string | Unique assignment identifier | Auto-generated |
| `assignment_number` | string | Assignment Number (ASN-####) | Pattern: `^ASN-[0-9]{4,6}$` |
| `order_id` | string | Associated order ID | Required |
| `workforce_id` | string | Assigned worker ID | Required |
| `vendor_id` | string | Vendor providing the worker | Required |
| `role` | string | Role assigned | Required |
| `assignment_status` | string | Assignment status | Enum: "Pending", "Confirmed", "Checked In", "In Progress", "Completed", "Cancelled", "No Show" |
| `scheduled_start` | timestamp | Scheduled start time | Required |

### 23. Workforce Entity
**Description:** Worker/contractor management.

| Field Name | Type | Description | Validation |
| :--- | :--- | :--- | :--- |
| `id` | string | Unique workforce identifier | Auto-generated |
| `workforce_number` | string | Worker Number (WF-####) | Required, Pattern: `^WF-[0-9]{4,6}$` |
| `vendor_id` | string | Vendor who manages this worker | Required |
| `first_name` | string | Worker first name | Required |
| `last_name` | string | Worker last name | Required |
| `employment_type` | string | Employment classification | Enum: "W2", "1099", "Temporary", "Contract" |
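
Several entities above share the same numbered-identifier convention (`ORD-####`, `ASN-####`, `WF-####`). A small illustrative sketch that centralizes those checks (the patterns are copied from the entity tables; the helper itself is not part of the SDK):

```javascript
// Illustrative validation of the numbered-identifier patterns defined in the
// Order, Assignment, and Workforce entity tables above.
const ID_PATTERNS = {
  order_number: /^ORD-[0-9]{4,6}$/,
  assignment_number: /^ASN-[0-9]{4,6}$/,
  workforce_number: /^WF-[0-9]{4,6}$/
};

function validateIdentifier(field, value) {
  const pattern = ID_PATTERNS[field];
  if (!pattern) throw new Error(`Unknown identifier field: ${field}`);
  return pattern.test(value);
}

console.log(validateIdentifier("workforce_number", "WF-000123")); // true
console.log(validateIdentifier("order_number", "ORD-123"));       // false (too few digits)
```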

---

## 4. SDK Operations

All entities support the following base operations. Replace `EntityName` with the specific entity (e.g., `Event`, `Staff`, `Invoice`).

### List & Filter
```javascript
// List all records (default limit: 50)
const records = await base44.entities.EntityName.list();

// List with sorting (descending by created_date)
const sorted = await base44.entities.EntityName.list('-created_date');

// Filter with multiple conditions
const filtered = await base44.entities.EntityName.filter({
  status: 'Active',
  created_by: user.email
});

// Filter with operators ($gte, $lte, $in, $contains)
const matching = await base44.entities.EntityName.filter({
  rating: { $gte: 4.5 },
  total: { $lte: 1000 }
});
```

### CRUD Operations
```javascript
// Create
const newRecord = await base44.entities.EntityName.create({
  field1: 'value1',
  field2: 'value2'
});

// Bulk Create
const newRecords = await base44.entities.EntityName.bulkCreate([
  { field1: 'value1' },
  { field1: 'value2' }
]);

// Update
const updatedRecord = await base44.entities.EntityName.update(recordId, {
  field1: 'new value'
});

// Delete
await base44.entities.EntityName.delete(recordId);

// Get Schema
const schema = await base44.entities.EntityName.schema();
```

---

## 5. Core Integrations

### InvokeLLM
Generates a response from an LLM with a prompt.

| Parameter | Type | Required | Description |
| :--- | :--- | :--- | :--- |
| `prompt` | string | Yes | The prompt to send to the LLM |
| `add_context_from_internet` | boolean | No | Fetch context from Google Search/Maps/News |
| `response_json_schema` | object | No | JSON schema for structured output |
| `file_urls` | string[] | No | Array of file URLs for context |

```javascript
const data = await base44.integrations.Core.InvokeLLM({
  prompt: "Give me data on Apple",
  add_context_from_internet: true,
  response_json_schema: {
    type: "object",
    properties: {
      stock_price: { type: "number" },
      ceo: { type: "string" }
    }
  }
});
```

### SendEmail
Send an email to a user.

| Parameter | Type | Required | Description |
| :--- | :--- | :--- | :--- |
| `to` | string | Yes | Recipient email address |
| `subject` | string | Yes | Email subject line |
| `body` | string | Yes | Email body (supports HTML) |
| `from_name` | string | No | Sender name |

### File Operations

* **UploadFile:** Upload a file to public storage.
* **UploadPrivateFile:** Upload a file to private storage (requires a signed URL for access).
* **CreateFileSignedUrl:** Create a temporary signed URL for accessing a private file.

```javascript
// Upload a private document
const { file_uri } = await base44.integrations.Core.UploadPrivateFile({
  file: sensitiveDocument
});

// Create a signed URL (valid for one hour)
const { signed_url } = await base44.integrations.Core.CreateFileSignedUrl({
  file_uri: file_uri,
  expires_in: 3600
});
```
|
|
||||||
|
|
||||||
### ExtractDataFromUploadedFile

Extract structured data from uploaded files (CSV, PDF, images).

```javascript
const result = await base44.integrations.Core.ExtractDataFromUploadedFile({
  file_url: file_url,
  json_schema: {
    type: "array",
    items: {
      type: "object",
      properties: {
        employee_name: { type: "string" },
        email: { type: "string" }
      }
    }
  }
});
```

---

## 6. Data Models Reference

### Complete Event Object Example

```json
{
  "id": "evt_1234567890",
  "event_name": "Google Campus Lunch Service",
  "is_recurring": true,
  "status": "Active",
  "date": "2025-01-15",
  "shifts": [
    {
      "shift_name": "Lunch Shift",
      "roles": [
        {
          "role": "Server",
          "count": 10,
          "hours": 4,
          "cost_per_hour": 25,
          "total_value": 1000
        }
      ]
    }
  ],
  "assigned_staff": [
    {
      "staff_id": "stf_111",
      "staff_name": "Maria Garcia",
      "confirmed": true
    }
  ]
}
```
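
In the example above, the role's `total_value` is consistent with `count × hours × cost_per_hour` (10 × 4 × 25 = 1000). A small sketch of a consistency check built on that assumption — the relationship is inferred from this one example, not a documented invariant:

```javascript
// Hypothetical check: derive a role's expected total_value from its other
// fields. Assumes total_value = count * hours * cost_per_hour, as above.
function roleTotalValue(role) {
  return role.count * role.hours * role.cost_per_hour;
}

const role = { role: "Server", count: 10, hours: 4, cost_per_hour: 25, total_value: 1000 };
console.log(roleTotalValue(role) === role.total_value); // → true
```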

---

## 7. Code Examples

### Example: Create Event with Staff Assignment

```javascript
import { base44 } from "@/api/base44Client";

async function createEventAndAssignStaff() {
  const user = await base44.auth.me();

  // 1. Create Event
  const event = await base44.entities.Event.create({
    event_name: "Corporate Holiday Party",
    date: "2025-12-20",
    status: "Draft",
    requested: 20,
    shifts: [{
      shift_name: "Evening Service",
      roles: [{ role: "Server", count: 15, cost_per_hour: 28 }]
    }],
    client_email: user.email
  });

  // 2. Find Staff
  const availableStaff = await base44.entities.Staff.filter({
    position: "Server",
    rating: { $gte: 4.5 }
  }, '-rating', 15);

  // 3. Assign Staff
  const assignedStaff = availableStaff.map(staff => ({
    staff_id: staff.id,
    staff_name: staff.employee_name,
    role: "Server",
    confirmed: false
  }));

  await base44.entities.Event.update(event.id, {
    assigned_staff: assignedStaff,
    status: "Pending"
  });

  return event;
}
```

### Example: AI-Powered Staff Recommendation

```javascript
import { base44 } from "@/api/base44Client";

async function getStaffRecommendations(eventId) {
  const events = await base44.entities.Event.filter({ id: eventId });
  const event = events[0];
  const allStaff = await base44.entities.Staff.list();

  const recommendations = await base44.integrations.Core.InvokeLLM({
    prompt: `Analyze this event (${event.event_name}) and this staff roster (${JSON.stringify(allStaff)}), and recommend staff based on rating and reliability.`,
    response_json_schema: {
      type: "object",
      properties: {
        recommendations: {
          type: "array",
          items: {
            type: "object",
            properties: {
              staff_id: { type: "string" },
              score: { type: "number" },
              reasoning: { type: "string" }
            }
          }
        }
      }
    }
  });

  return recommendations;
}
```

---

## 8. Best Practices

1. **Error Handling:** Always wrap calls in `try/catch` blocks and provide user-friendly error messages.
2. **Query Optimization:** Filter data at the database level (using `.filter()`) rather than fetching all records and filtering in memory.
3. **Pagination:** Use the `limit` and `offset` parameters in `.list()` for large datasets to improve performance.
4. **Caching:** Use libraries like React Query to cache SDK responses and reduce API calls.
5. **Batch Operations:** Prefer `bulkCreate` over looping through single create calls.
6. **Team Isolation:** Rely on the built-in `owner_id` filtering. Do not attempt to bypass cross-layer visibility rules (e.g., Vendors should not see Procurement teams).
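
Practices 1 and 3 can be combined in one small pagination loop. A sketch over a generic page-fetching function — `fetchPage` stands in for a call such as `base44.entities.Staff.list` with `limit`/`offset`, whose exact signature is not specified here, and the wrapper itself is illustrative:

```javascript
// Hypothetical helper: fetch all records page by page, wrapping each call
// in try/catch as recommended above. `fetchPage(limit, offset)` must return
// an array; a page shorter than `pageSize` signals the end of the dataset.
async function fetchAll(fetchPage, pageSize = 100) {
  const all = [];
  let offset = 0;
  while (true) {
    let page;
    try {
      page = await fetchPage(pageSize, offset);
    } catch (err) {
      console.error("Page fetch failed at offset", offset, err);
      throw err;
    }
    all.push(...page);
    if (page.length < pageSize) break; // last page reached
    offset += pageSize;
  }
  return all;
}
```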

---

## 9. Security Considerations

* **User Entity Access Control:** Built-in rules enforce that only admins can modify other users. Regular users are restricted to their own records.
* **Team Isolation:** Complete data isolation across organizational layers (Vendor, Procurement, Operator, Client) is enforced via `owner_id`.
* **Private Files:** Always use `UploadPrivateFile` for sensitive documents (W9, Insurance, etc.) and generate temporary signed URLs for access.
* **Input Validation:** While the API performs validation, always validate email formats and required fields on the client side before making requests.
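
A minimal sketch of the client-side checks suggested in the last point (the regex is a simple illustrative format check, not a full RFC 5322 validator, and the helper names are hypothetical):

```javascript
// Hypothetical pre-flight validation before calling the API.
function validateEmailFormat(email) {
  return typeof email === "string" && /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(email);
}

function assertRequiredFields(payload, requiredFields) {
  const missing = requiredFields.filter((f) => payload[f] === undefined || payload[f] === "");
  if (missing.length > 0) {
    throw new Error(`Missing required fields: ${missing.join(", ")}`);
  }
}

console.log(validateEmailFormat("maria@example.com")); // → true
console.log(validateEmailFormat("not-an-email"));      // → false
```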

---

## 10. Rate Limits & Quotas

| Operation | Limit |
| :--- | :--- |
| **Entity Operations** | 1000 requests/minute |
| **LLM Invocations** | 100 requests/minute |
| **File Uploads** | 100 MB per file |
| **Email Sending** | 1000 emails/day |

*Tip: Implement exponential backoff for retries if you hit these limits.*
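
One way to implement the tip above, sketched as a generic retry wrapper. The base delay, retry count, and the idea that any thrown error is retryable are assumptions; a real implementation would inspect the error for a rate-limit status before retrying:

```javascript
// Hypothetical retry helper with exponential backoff: 500 ms, 1 s, 2 s, ...
async function withBackoff(operation, maxRetries = 5, baseDelayMs = 500) {
  for (let attempt = 0; attempt < maxRetries; attempt++) {
    try {
      return await operation();
    } catch (err) {
      if (attempt === maxRetries - 1) throw err; // out of retries
      const delay = baseDelayMs * 2 ** attempt;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}

// Usage (illustrative):
// const events = await withBackoff(() => base44.entities.Event.list());
```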

---

## 11. Changelog

**Version 3.0 (2025-11-20)**

* Added Team, TeamMember, TeamHub, TeamMemberInvite entities.
* Added Assignment and Workforce entities.
* Enhanced Invoice entity with dispute tracking.
* Updated Event entity (conflict detection, multi-day support).

**Version 2.0 (2025-01-11)**

* Complete entity schema documentation.
* Core integration specifications.

---

## 12. Support & Resources

* **Platform Docs:** [https://docs.base44.com](https://docs.base44.com)
* **API Reference:** [https://api.base44.com/docs](https://api.base44.com/docs)
* **Technical Support:** support@krow.com

© 2025 KROW Workforce. All rights reserved.

---

# Prompts for the Base44 AI

This file contains standardized prompts for use with the Base44 platform's AI assistant. The goal is to obtain precise, well-structured information about the "KROW Workforce" project's API and data schemas in order to facilitate the development of integrations and, if necessary, a future migration.

## 1. Main Prompt: Request for Complete Documentation (Discreet and Iterative Version)

*Use this prompt after each new frontend export from Base44. It is phrased to avoid revealing the intention to migrate by requesting an update to an existing document.*

---

```text
Hello!

I am the lead developer for the "KROW Workforce" project. We are looking to deeply integrate the Base44 platform into our wider ecosystem by building several custom tools and automations.

Previously, you generated an `API_documentation.md` file that was incredibly helpful for our team. To ensure our new integration projects are built against the most current project state, could you please refresh and exhaustively update this `API_documentation.md` file?

Please conduct a full review of our project and ensure the updated documentation includes the following for **all current entities** (Events, Staff, Vendors, Invoices, Users, Teams, etc.):

1. **Entity Data Schemas**: For each entity, a detailed breakdown of its fields in a table or JSON Schema format, including:
   * `Field Name`
   * `Data Type` (e.g., `string`, `number`, `timestamp`)
   * `Description` of the field's purpose.
   * `Validation Rules` (e.g., `required`, `unique`, `enum` values).

2. **SDK Operations**: A complete list of all available SDK methods for each entity (e.g., `.list()`, `.filter()`, `.create()`, `.update()`).

3. **Integrations API**: The full specification for all services under `base44.integrations.Core` (`SendEmail`, `InvokeLLM`, etc.), including their input parameters and output structure.

After you have updated the file, please output the **entire, updated content** of `API_documentation.md` in your response. This will serve as the single source of truth for our development team.

**IMPORTANT: Please ensure you output the ENTIRE, updated content of `API_documentation.md` directly in your response, not just a summary.**

Thank you!
```

---

## 2. Secondary Prompt: Request for a Specific Entity Schema

*Use this prompt if you need to quickly verify the structure of a single entity without requesting the full documentation.*

---

```text
Hello!

For the "KROW Workforce" project, could you please provide the detailed data schema for the **[ENTITY_NAME]** entity?

Please format the response as a JSON Schema or a Markdown table, including the field names, data types, a description of each field, and any validation rules (such as `required` fields or `enum` values).

For example, you might document an entity named `Event`.
```

---

# SR&ED Project Documentation - KROW Platform

This document serves as the primary record for tracking Scientific Research and Experimental Development (SR&ED) activities for the KROW project. It translates our project plan into the language of technological uncertainty and systematic investigation, as required for SR&ED claims.

## Overall Technological Uncertainty

The core technological uncertainty of this project is whether a unified backend, built on the novel Firebase Data Connect service, can effectively and performantly serve a heterogeneous set of clients (a React web app and two Flutter mobile apps) while maintaining data integrity in a complex relational model (PostgreSQL). This involves overcoming challenges in schema management, SDK generation, and real-time data synchronization across platforms, for which no standard industry solution exists.

---

## Milestone 1: Foundation & Dev Environment Setup

### 1.1. Technological Uncertainty

Can we establish a stable, multi-environment (dev, staging, prod) development workflow for a complex monorepo that integrates a declarative backend (Data Connect), a web frontend, and mobile frontends? The primary challenge is to create a reproducible setup that overcomes the limitations of local emulation and allows for parallel, collaborative development on a shared cloud infrastructure without conflicts.

### 1.2. Hypothesis

By combining a multi-environment `Makefile`, Firebase project aliases, and auto-generated, environment-aware SDKs, we hypothesize that we can create a streamlined and scalable development workflow. This approach should allow developers to seamlessly switch between cloud environments and ensure that all client applications (web and mobile) always interact with the correct backend instance.

### 1.3. Experimental Work

*(This section can be auto-populated by running `make export-issues` with the appropriate filters/labels.)*

- **`[Infra] Create Multi-Env Makefile`:** Development of a script to manage different cloud environments, a non-trivial engineering task involving environment variable injection and conditional logic.
- **`[Backend] Define GraphQL Schema & Deploy to Dev`:** Experimentation with the Data Connect schema-to-SQL generation process to validate its capabilities, performance with relational data, and limitations.
- **`[Web/Mobile] Generate & Integrate SDKs`:** Systematic investigation into the interoperability of the auto-generated SDKs with modern frontend frameworks (React/TanStack Query and Flutter/BLoC).

### 1.4. Results & Learnings

*(To be filled out upon milestone completion.)*

---

## Milestone 2: Core Feature Implementation

### 2.1. Technological Uncertainty

Once the foundational architecture is in place, the next uncertainty is whether the declarative nature of Data Connect is powerful enough to handle the complex business logic required by the KROW platform. Can we implement features like multi-step event creation, real-time status updates, and complex data validation purely through GraphQL mutations and queries, without needing a separate, imperative logic layer (like traditional Cloud Functions)?

### 2.2. Hypothesis

We hypothesize that by leveraging advanced GraphQL features and the underlying power of PostgreSQL (accessible via Data Connect), we can encapsulate most, if not all, of the core business logic directly within our Data Connect backend. This would create a more maintainable and "self-documenting" system where the API definition itself contains the business rules.

### 2.3. Experimental Work

*(This section can be auto-populated by running `make export-issues` with the appropriate filters/labels.)*

- **`[Backend] Implement Full API Logic`:** This involves systematically testing the limits of Data Connect's mutation capabilities to handle transactional logic and data validation.
- **`[Web/Mobile] Full Application Re-wiring`:** This work will test the performance and ergonomics of the generated SDKs at scale, across dozens of components and screens.

### 2.4. Results & Learnings

*(To be filled out upon milestone completion.)*

---

## Milestone 3: Production Readiness & Go-Live

### 3.1. Technological Uncertainty

The final uncertainty is whether our automated, monorepo-based deployment strategy is robust and reliable enough for production. Can we create CI/CD pipelines that correctly build, test, and deploy three distinct artifacts (Web, Mobile, Backend) in a coordinated manner, while managing environment-specific configurations and secrets securely?

### 3.2. Hypothesis

We hypothesize that by using a combination of GitHub Actions for workflow orchestration and CodeMagic for specialized Flutter builds, managed by our central `Makefile`, we can create a fully automated "push-to-deploy" system for all environments.

### 3.3. Experimental Work

*(This section can be auto-populated by running `make export-issues` with the appropriate filters/labels.)*

- **`[CI/CD] Configure Deployment Pipelines`:** Significant engineering work to script and test the automated build and deployment processes for each part of the monorepo.
- **`[Data] Create & Test Initial Data Import Scripts`:** Development of reliable and idempotent scripts to populate the production database.

### 3.4. Results & Learnings

*(To be filled out upon milestone completion.)*

---

# Development Conventions

This document outlines the development conventions for the KROW project, including our GitHub label system.

## GitHub Labels

We use a structured system of labels to categorize and prioritize our work. The single source of truth for all available labels, their descriptions, and their colors is the `labels.yml` file at the root of this repository.

To apply these labels to the GitHub repository, run the following command:

```bash
make setup-labels
```

## GitHub Issue Template

To ensure consistency and capture all necessary information for both development and SR&ED tracking, we use a standardized issue template.

When creating a new issue on GitHub, select the **"SR&ED Task"** template. This will pre-populate the issue description with the following sections:

- **🎯 Objective:** A one-sentence summary of the goal.
- **🔬 SR&ED Justification:** A section to detail the technological uncertainty and the systematic investigation.
- **💻 Technical Implementation Notes:** A place for technical guidance for the developer.
- **✅ Acceptance Criteria:** A checklist to define what "done" means for this task.

Using this template is mandatory for all new development tasks.

---

```mermaid
flowchart TD

    %% ================================
    %% FLOW START
    %% ================================
    subgraph "App Initialization"
        A[Vendor logs in or opens the app] --> B[Layout.jsx loads profile using base44 auth me and applies ProtectedRoute]
        B --> C[VendorDashboard.jsx summarizes listEvents and listStaff and updates layout via auth updateMe]
    end

    %% ================================
    %% MAIN MENU
    %% ================================
    C --> M{Main Menu}

    M --> OM[Orders]
    M --> WF[Workforce]
    M --> TM[Team and Hubs]
    M --> FN[Finance]
    M --> SC[Scheduling]
    M --> CM[Communication and CRM]
    M --> AN[Analytics and Auditing]
    M --> TB[Task Board]
    M --> FW[Financial Widgets]

    %% ================================
    %% ORDER MANAGEMENT
    %% ================================
    subgraph "Order Management"
        OM --> O1[VendorOrders.jsx filters vendor events and uses ConflictDetection]
        O1 --> O2{Are there orders?}
        O2 -- Yes --> O3[EventDetail.jsx shows order details]
        O3 --> O4[SmartAssignModal and SmartAssignmentEngine fill shifts and call updateEvent]
        O1 --> O5[CreateInvoiceModal builds invoice roles from Event list and creates Invoice]
    end

    %% ================================
    %% WORKFORCE MANAGEMENT
    %% ================================
    subgraph "Workforce Management"
        WF --> W1[StaffDirectory.jsx lists staff using Staff list]
        W1 --> W2[AddStaff.jsx and EditStaff.jsx create or update Staff]
        W1 --> W3[StaffOnboarding.jsx consolidates data and creates Staff]
        W1 --> W4[StaffAvailability.jsx tries to use WorkerAvailability list]
        W4 --> WX[Missing entity: WorkerAvailability not present in DataConnect schema]
    end

    %% ================================
    %% TEAM AND HUB MANAGEMENT
    %% ================================
    subgraph "Team and Hub Management"
        TM --> T1[Teams.jsx manages internal team]
        T1 --> T2[Invite Managers dialog]
        T1 --> T3[Create or view hubs dialog]
        T1 --> T4[Manage favorite or blocked staff]
    end

    %% ================================
    %% FINANCE MANAGEMENT
    %% ================================
    subgraph "Finance Management"
        FN --> F1[VendorRates.jsx and VendorRateCard read and create VendorRate]
        FN --> F2[SmartVendorOnboarding.jsx creates Vendor and VendorRate via base44 entities]
        F2 --> F3[Vendor connector exposes list, get, filter, create, update and delete operations]
    end

    %% ================================
    %% SCHEDULING AND CALENDAR
    %% ================================
    subgraph "Scheduling"
        SC --> S1[Schedule.jsx shows weekly shift calendar using listEvents]
    end

    %% ================================
    %% COMMUNICATION AND CRM
    %% ================================
    subgraph "Communication and CRM"
        CM --> C1[Messages.jsx and MessageInput.jsx use Conversation and Message for list create and update]
        CM --> C2[Business.jsx manages leads and clients using Business, Event and Invoice]
    end

    %% ================================
    %% ANALYTICS AND AUDITING
    %% ================================
    subgraph "Analytics and Auditing"
        AN --> A1[Reports.jsx combines listEvents, listStaff and listInvoice]
        AN --> A2[VendorPerformance.jsx shows performance metrics]
        AN --> A3[ActivityLog.jsx filters ActivityLog for the user]
        AN --> A4[VendorCompliance.jsx lists and captures Certification per staff member]
    end

    %% ================================
    %% TASK BOARD
    %% ================================
    subgraph "Task Board"
        TB --> TSK[TaskBoard.jsx tries to use Task entity]
        TSK --> TX[Missing entity: Task not present in schema]
    end

    %% ================================
    %% FINANCIAL WIDGETS (MOCK)
    %% ================================
    subgraph "Financial Widgets"
        FW --> FW1[VendorInvoices.jsx uses mock data without real GraphQL]
        FW --> FW2[VendorPerformance.jsx uses mock metrics without real GraphQL]
    end

    %% ================================
    %% OPTIONAL STYLING
    %% ================================
    style A fill:#f4f4f5,stroke:#333,stroke-width:2px
    style C fill:#e0f2fe,stroke:#0284c7,stroke-width:2px
    style O1 fill:#f0fdf4,stroke:#16a34a,stroke-width:1px
    style W1 fill:#fefce8,stroke:#ca8a04,stroke-width:1px
    style T1 fill:#f5f3ff,stroke:#7c3aed,stroke-width:1px
    style F1 fill:#fdf2f8,stroke:#db2777,stroke-width:1px
    style WX fill:#fee2e2,stroke:#b91c1c,stroke-width:2px
    style TX fill:#fee2e2,stroke:#b91c1c,stroke-width:2px
    style FW1 fill:#fce7f3,stroke:#be185d,stroke-width:1px
    style FW2 fill:#fce7f3,stroke:#be185d,stroke-width:1px
```

---

# [Auth] Implement Firebase Authentication via krowSDK Facade

Labels: feature, infra, platform:web, platform:backend, priority:high, sred-eligible
Milestone: Foundation & Dev Environment Setup

### 🎯 Objective

Replace the Base44 authentication client with a new internal SDK module, `krowSDK.auth`, backed by Firebase Authentication. This foundational task will unblock all future backend development and align the web application with the new GCP-based architecture.

### 🔬 SR&ED Justification

- **Technological Uncertainty:** What is the optimal way to create a seamless abstraction layer (`krowSDK`) that perfectly mimics the existing `base44` SDK's interface to minimize frontend refactoring, while integrating a completely different authentication provider (Firebase Auth)? A key uncertainty is how this facade will handle the significant differences in user session management and data retrieval (e.g., custom user fields like `user_role`, which are not native to the Firebase Auth user object) between the two systems.
- **Systematic Investigation:** We will conduct experimental development to build a `krowSDK.auth` module. This involves systematically mapping each `base44.auth` method (`me`, `logout`, `isAuthenticated`) to its Firebase Auth equivalent. We will investigate and prototype a solution for fetching supplementary user data (like `user_role`) from our Firestore database and merging it with the core Firebase Auth user object. This will establish a clean, reusable, and scalable architectural pattern for all future SDK modules.

### Details

This task is the most critical prerequisite for migrating our backend. As defined in `03-backend-api-specification.md`, every request to our new Data Connect and Cloud Functions API will require a `Bearer <Firebase-Auth-Token>` header. Without this, no backend work can proceed.

#### The Strategy: A Facade SDK (`krowSDK`)

Instead of replacing `base44` calls with Firebase calls directly throughout the codebase, we will create an abstraction layer, or "facade". This approach has three major benefits:

1. **Simplified Migration:** The new `krowSDK.auth` will expose the *exact same methods* as the old `base44.auth`. This means we can swap out the authentication logic with minimal changes to the UI components, drastically reducing the scope of refactoring.
2. **High Maintainability:** All authentication logic will be centralized in one place. If we ever need to change providers again, we only modify the SDK, not the entire application.
3. **Clear Separation of Concerns:** The UI components remain agnostic about the authentication provider. They just call `krowSDK.auth.me()` without worrying about the underlying implementation details.

#### Implementation Plan

The developer should create two new files:

**1. Firebase Configuration (`frontend-web/src/firebase/config.js`)**

This file will initialize the Firebase app and export the necessary services. It's crucial to use environment variables for the configuration keys to keep them secure and environment-specific.

```javascript
// frontend-web/src/firebase/config.js

import { initializeApp } from "firebase/app";
import { getAuth } from "firebase/auth";
import { getFirestore } from "firebase/firestore";

// Your web app's Firebase configuration
// IMPORTANT: Use environment variables for these values
const firebaseConfig = {
  apiKey: import.meta.env.VITE_FIREBASE_API_KEY,
  authDomain: import.meta.env.VITE_FIREBASE_AUTH_DOMAIN,
  projectId: import.meta.env.VITE_FIREBASE_PROJECT_ID,
  storageBucket: import.meta.env.VITE_FIREBASE_STORAGE_BUCKET,
  messagingSenderId: import.meta.env.VITE_FIREBASE_MESSAGING_SENDER_ID,
  appId: import.meta.env.VITE_FIREBASE_APP_ID
};

// Initialize Firebase
const app = initializeApp(firebaseConfig);

// Export Firebase services
export const auth = getAuth(app);
export const db = getFirestore(app);

export default app;
```

**2. Krow SDK (`frontend-web/src/lib/krowSDK.js`)**

This is the core of the task. This file implements the facade, importing the Firebase `auth` service and recreating the `base44.auth` interface.
```javascript
|
|
||||||
// frontend-web/src/lib/krowSDK.js
|
|
||||||
|
|
||||||
import { auth, db } from '../firebase/config';
|
|
||||||
import {
|
|
||||||
onAuthStateChanged,
|
|
||||||
signOut,
|
|
||||||
updateProfile,
|
|
||||||
// Import other necessary auth functions like signInWithEmailAndPassword, createUserWithEmailAndPassword, etc.
|
|
||||||
} from "firebase/auth";
|
|
||||||
import { doc, getDoc } from "firebase/firestore";
|
|
||||||
|
|
||||||
/**
|
|
||||||
* A promise-based wrapper for onAuthStateChanged to check the current auth state.
|
|
||||||
* @returns {Promise<boolean>} - A promise that resolves to true if authenticated, false otherwise.
|
|
||||||
*/
|
|
||||||
const isAuthenticated = () => {
|
|
||||||
return new Promise((resolve) => {
|
|
||||||
const unsubscribe = onAuthStateChanged(auth, (user) => {
|
|
||||||
unsubscribe();
|
|
||||||
resolve(!!user);
|
|
||||||
});
|
|
||||||
});
|
|
||||||
};
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Fetches the current authenticated user's profile.
|
|
||||||
* This function mimics `base44.auth.me()` by combining the Firebase Auth user
|
|
||||||
* with custom data from our Firestore database (e.g., user_role).
|
|
||||||
* @returns {Promise<object|null>} - A promise that resolves with the user object or null.
|
|
||||||
*/
|
|
||||||
const me = () => {
|
|
||||||
return new Promise((resolve, reject) => {
|
|
||||||
const unsubscribe = onAuthStateChanged(auth, async (user) => {
|
|
||||||
unsubscribe();
|
|
||||||
if (user) {
|
|
||||||
try {
|
|
        // 1. Get the core user from Firebase Auth
        const { uid, email, displayName } = user;
        const baseProfile = {
          id: uid,
          email: email,
          full_name: displayName,
        };

        // 2. Get custom fields from Firestore
        // We assume a 'users' collection where the document ID is the user's UID.
        const userDocRef = doc(db, "users", uid);
        const userDoc = await getDoc(userDocRef);

        if (userDoc.exists()) {
          const customData = userDoc.data();
          // 3. Merge the two data sources
          resolve({
            ...baseProfile,
            user_role: customData.user_role, // Example custom field
            // ... other custom fields from Firestore
          });
        } else {
          // User exists in Auth but not in Firestore. This can happen during sign-up.
          // Resolve with base profile; the app should handle creating the Firestore doc.
          resolve(baseProfile);
        }
      } catch (error) {
        console.error("Error fetching user profile from Firestore:", error);
        reject(error);
      }
    } else {
      // No user is signed in.
      resolve(null);
    }
  });
});
};

/**
 * Updates the current user's profile.
 * This mimics `base44.auth.updateMe()`.
 * @param {object} profileData - Data to update, e.g., { full_name: "New Name" }.
 */
const updateMe = async (profileData) => {
  if (auth.currentUser) {
    // Firebase Auth's updateProfile only supports displayName and photoURL.
    if (profileData.full_name) {
      await updateProfile(auth.currentUser, {
        displayName: profileData.full_name,
      });
    }
    // For other custom fields, you would need to update the user's document in Firestore.
    // const userDocRef = doc(db, "users", auth.currentUser.uid);
    // await updateDoc(userDocRef, { custom_field: profileData.custom_field });
  } else {
    throw new Error("No authenticated user to update.");
  }
};

/**
 * Logs the user out and redirects them.
 * @param {string} [redirectUrl='/'] - The URL to redirect to after logout.
 */
const logout = (redirectUrl = '/') => {
  signOut(auth).then(() => {
    // Redirect after sign-out.
    window.location.href = redirectUrl;
  }).catch((error) => {
    console.error("Logout failed:", error);
  });
};

// The krowSDK object that mimics the Base44 SDK structure
export const krowSDK = {
  auth: {
    isAuthenticated,
    me,
    updateMe,
    logout,
    // Note: redirectToLogin is not implemented as it's a concern of the routing library (e.g., React Router),
    // which should protect routes and redirect based on the authentication state.
  },
  // Future modules will be added here, e.g., krowSDK.entities.Event
};
```

### ✅ Acceptance Criteria

- [ ] A `frontend-web/src/firebase/config.js` file is created, correctly initializing the Firebase app using environment variables.
- [ ] A `frontend-web/src/lib/krowSDK.js` file is created and implements the `krowSDK.auth` facade.
- [ ] The `krowSDK.auth` module exports `isAuthenticated`, `me`, `updateMe`, and `logout` functions with interfaces identical to their `base44.auth` counterparts.
- [ ] Key parts of the application (e.g., login pages, user profile components, auth checks) are refactored to import and use `krowSDK.auth` instead of `base44.auth`.
- [ ] A new user can sign up using a Firebase-powered form.
- [ ] An existing user can log in using a Firebase-powered form.
- [ ] The application UI correctly updates to reflect the user's authenticated state (e.g., showing the user's name, hiding the login button).
- [ ] After logging in, the user's Firebase Auth ID Token can be retrieved and is ready to be sent in an `Authorization: Bearer` header for API calls.
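
The last criterion can be sketched as a small wrapper around `fetch`. This is a hypothetical helper, not part of the spec: the names `authorizedFetch` and `buildAuthHeader` are illustrative, and the helper accepts any object exposing `currentUser.getIdToken()` (such as the Firebase Auth instance) plus a fetch implementation, so it stays testable without Firebase itself.

```javascript
// Hypothetical helper (names are illustrative, not part of the spec).
// `authLike` is any object exposing currentUser.getIdToken(), e.g. the
// Firebase Auth instance; `fetchImpl` is typically window.fetch.
const buildAuthHeader = (token) => ({ Authorization: `Bearer ${token}` });

async function authorizedFetch(authLike, url, fetchImpl, options = {}) {
  const user = authLike.currentUser;
  if (!user) throw new Error("No authenticated user.");
  // Firebase's getIdToken() transparently refreshes an expired token.
  const token = await user.getIdToken();
  return fetchImpl(url, {
    ...options,
    headers: { ...(options.headers || {}), ...buildAuthHeader(token) },
  });
}
```

In the app this would be called with the real `auth` instance and `window.fetch`; the backend then verifies the token on each request.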
@@ -1,34 +0,0 @@
I have a monorepo containing two separate Flutter applications:

- mobile-apps/client-app
- mobile-apps/staff-app

I want to configure Codemagic so both apps can be built and distributed to Firebase App Distribution.

Please do the following:
1. Propose the best layout for Codemagic workflows.
2. Create three separate pipelines for each application:
   * Development (dev)
   * Staging (stage)
   * Production (prod)
3. Each pipeline must:
   * Build the correct Flutter app inside the monorepo.
   * Use the correct Firebase App Distribution credentials for each environment.
   * Push the built artifacts (Android + iOS if applicable) to the appropriate Firebase App Distribution app.
   * Include environment-specific values (e.g., env variables, bundle IDs, keystore/certs, build flavors).
   * Allow triggering pipelines manually.
4. Generate a complete codemagic.yaml example with:
   * Separate workflows for:
     * `client_app_dev`, `client_app_staging`, `client_app_prod`
     * `staff_app_dev`, `staff_app_staging`, `staff_app_prod`
   * All required steps (install Flutter/Java/Xcode, pub get, build runner if needed, building APK/AAB/IPA, uploading to Firebase App Distribution, etc.).
   * Example Firebase App IDs, release notes, tester groups, service accounts.
   * Proper use of Codemagic encrypted variables.
   * Best practices for monorepo path handling.
5. Add a short explanation of:
   * How each pipeline works
   * How to trigger builds
   * How to update environment variables for Firebase
6. Output the final result as:
   * A complete `codemagic.yaml`
   * A brief guide on integrating it with a monorepo
   * Notes on debugging, caching, and CI/CD optimization
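
For orientation, one such workflow might be sketched as below. This is only a sketch: the app ID, variable-group name, tester group, and flavor/entry-point names are placeholders, and the exact keys should be verified against the current Codemagic YAML documentation.

```yaml
workflows:
  client_app_dev:
    name: Client App (dev)
    instance_type: mac_mini_m1
    working_directory: mobile-apps/client-app   # monorepo path handling
    environment:
      groups:
        - firebase_dev            # placeholder encrypted-variable group
      flutter: stable
    triggering:
      events: []                  # manual triggering only
    scripts:
      - name: Get dependencies
        script: flutter pub get
      - name: Build dev APK
        script: flutter build apk --flavor dev -t lib/main_dev.dart
    artifacts:
      - build/**/outputs/**/*.apk
    publishing:
      firebase:
        firebase_service_account: $FIREBASE_SERVICE_ACCOUNT
        android:
          app_id: 1:1234567890:android:placeholder   # placeholder Firebase App ID
          groups:
            - dev-testers
```

The staging and production workflows would repeat this shape with their own variable groups, flavors, and Firebase App IDs.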
@@ -1,80 +0,0 @@
## What the prompt does
- Generate a full architecture diagram for a Flutter project based on the given overview and use case diagram
- The architecture diagram should include the frontend, backend, and database layers
- The architecture diagram should be generated using mermaid syntax

## Assumptions
- Flutter project is given
- Overview mermaid diagram is given
- Use case diagram is given
- Backend architecture mermaid diagram is given
- The backend architecture diagram should be generated based on the given overview and use case diagrams

## How to use the prompt
For the given Flutter project, I want to generate a complete architecture document. Use the codebase together with the following Mermaid files:

* `overview.mermaid`
* `api_map.mermaid`
* `backend_architecture.mermaid`
* `use_case_flows.mermaid`
* `use-case-flowchart.mermaid` (duplicate file if needed for additional reference)

Your tasks:

1. Analyze the **Flutter project structure**, the relevant backend integrations, and all provided Mermaid diagrams.

2. Create a **comprehensive Markdown architecture document** (`architecture.md`) that includes:

### A. Introduction

* High-level summary of the project.
* Brief description of the core purpose of the app.

### B. Full Architecture Overview

* Explanation of the app architecture used (e.g., layered architecture, MVVM, Clean Architecture, etc.).
* Description of key modules, layers, and responsibilities.
* Integration points between UI, domain, and data layers.

### C. Backend Architecture

* Based on the backend diagrams, describe:

  * How GraphQL is used.
  * How Firebase services (Auth, Firestore, Storage, Functions, etc.) are integrated.
  * How the app communicates with the backend end-to-end.
  * API flow between Flutter → GraphQL → Firebase.

### D. API Layer

* Summaries of GraphQL queries, mutations, and subscriptions.
* Explanation of how the app handles API errors, retries, caching, and parsing.
* Any backend-dependent logic highlighted in diagrams.

### E. State Management

* Identify the state management approach used (Bloc, Riverpod, Provider, Cubit, ValueNotifier, etc.).
* Explain:

  * Why this method was chosen.
  * How state flows between UI, logic, and backend.
  * How the state management integrates with the API layer.

### F. Use-Case Flows

* Explain each major use case using the `use_case_flows.mermaid` and `use-case-flowchart.mermaid` diagrams.
* Describe the UI → Logic → Backend → Response cycle for each use case.

### G. Backend Replacement Section

Add a dedicated section titled **“Replacing or Plugging in a New Backend: Considerations & Recommendations”**, including:

* What parts of the codebase are tightly coupled to the current backend.
* What should be abstracted (e.g., repositories, services, DTOs, error handling).
* How to structure interfaces to allow backend swapping.
* Suggested design improvements to make the architecture more backend-agnostic.
* Migration strategies for replacing GraphQL + Firebase with another backend (REST, Supabase, Hasura, etc.).

3. Make the Markdown document clear, well-structured, and easy for developers to use as a long-term reference.

4. Output the final result as a **single `architecture.md` file**.
@@ -1,53 +0,0 @@
## What the prompt does
This prompt generates a Mermaid diagram that visualizes the backend architecture of a Flutter project. It uses the given overview and use case diagrams to create a detailed diagram that shows the relationships between different components and services.

## Assumptions
- Flutter project is given
- Overview mermaid diagram is given
- Use case diagram is given

## How to use the prompt
For the given Flutter project, the backend uses **GraphQL** and **Firebase**. I want multiple detailed Mermaid diagrams to understand how everything is connected.

Please do the following:

1. **Read and analyze** the entire project, along with these two files:

   * `overview.mermaid`
   * `use-case-flowchart.mermaid`

2. Based on all available information, generate **three separate Mermaid diagrams**:

### A. Backend Architecture Diagram

* Show the high-level structure of the backend.
* Include GraphQL server components, Firebase services (Auth, Firestore, Storage, Functions, etc.), and how the Flutter app connects to them.
* Show data flow between Flutter → GraphQL → Firebase → back to the client.

### B. API Map (GraphQL Operations + Firebase Interactions)

* List and group all GraphQL queries, mutations, and subscriptions.
* Show which ones interact with Firebase and how.
* If Firestore collections or documents are involved, show them as nodes.
* Clearly illustrate the relationship between API operations and backend resources.

### C. Use-Case Flow Diagrams

* For each major use case in the project:

  * Show how the request moves from the Flutter UI to the backend.
  * Show the sequence of steps involving GraphQL operations and Firebase services.
  * Show how responses return to the UI.
* Organize all use cases into **one combined Mermaid diagram** or **multiple subgraph clusters**.

3. Ensure all diagrams are:

   * Clean, readable, and logically grouped
   * Consistent with the structure of the existing project and the two Mermaid reference files
   * Detailed enough for developers to understand backend behavior at a glance

4. Output the three diagrams clearly labeled as:

   * **Backend Architecture**
   * **API Map**
   * **Use-Case Flows**
@@ -1,21 +0,0 @@
## What the prompt does
This prompt generates a Mermaid diagram that provides an overview of the given Flutter application's architecture. It includes the main components of the app, such as the widget tree, state management, and navigation.

## Assumptions
- Flutter project is given.

## Prompt
For the given Flutter project, carefully analyze all files to understand the app’s navigation, logic flow, state management, and how each screen connects to the next.

Using that analysis, generate a **Mermaid flowchart** that shows:

* The app’s entry point (main.dart → root widget).
* The initial flow (e.g., splash → login/signup → authenticated home).
* All pages/screens and how navigation occurs between them.
* Conditional routing logic (authentication checks, API responses, state changes, etc.).
* Any loops, flows, or background logic that impact navigation.

The diagram must accurately reflect the project structure and not make assumptions beyond what is found in the code.

Use **Mermaid `flowchart TD` or `flowchart LR`** format, with clear labels for each screen and logic decision.

If needed, ask me for missing files so you can produce a complete diagram.
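
To illustrate the expected output shape (the screen names below are invented placeholders, not taken from any actual project), a fragment of such a flowchart might look like:

```mermaid
flowchart TD
    A[main.dart] --> B[Root Widget]
    B --> C{Authenticated?}
    C -- yes --> D[Home Screen]
    C -- no --> E[Login Screen]
    E -- login success --> D
```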
@@ -1,22 +0,0 @@
## What the prompt does
This prompt generates a Mermaid diagram that visualizes the use cases of a Flutter application. The diagram includes the actors, use cases, and relationships between them.

## Assumptions
- Flutter project is given
- Overview mermaid diagram is given

## How to use the prompt
Using the given Flutter project code — along with the previously generated Mermaid diagram that outlines the app’s navigation and logic flow — analyze the entire system to identify **all main use cases and their sub-use cases**.

From this analysis, create a **Mermaid flowchart** that shows:

* All high-level use cases the app supports (e.g., Authentication, Profile Management, Dashboard Interaction, Data Fetching, Settings, etc.).
* How each use case breaks down into sub-use cases (e.g., Authentication → Login, Signup, Logout, Token Refresh).
* How these use cases depend on each other or trigger one another.
* Connections between use cases as they appear in the actual code (UI actions, state changes, service calls, repository interactions, etc.).
* Any decision points or conditions that influence which sub-use case occurs next.

Format the output as a **clear Mermaid diagram** using `flowchart TD` or `flowchart LR`.
Ensure the flow represents the **real behavior of the code**, not assumptions.

Ask for any missing files if needed to build a complete and accurate use-case hierarchy.