feat(Makefile): restructure Makefile into modular files for better organization and maintainability

feat(Makefile): introduce common.mk for shared variables and environment configuration
feat(Makefile): create web.mk for web frontend-related tasks
feat(Makefile): create launchpad.mk for internal launchpad deployment tasks
feat(Makefile): create mobile.mk for mobile app development tasks
feat(Makefile): create dataconnect.mk for Data Connect management tasks
feat(Makefile): create tools.mk for development tools like git hooks
feat(Makefile): remove admin-web specific tasks and files as the admin console is no longer actively maintained
feat(Makefile): update help command to reflect the new modular structure and available commands
feat(Makefile): remove base44 export workflow as it is no longer relevant
feat(Makefile): remove IAP configuration as it is no longer used
feat(Makefile): remove harness-related tasks as they are no longer relevant

The Makefile has been significantly refactored to improve organization and maintainability. The changes include:

- Modularization: The monolithic Makefile has been split into smaller, more manageable files, each responsible for a specific area of the project (web, launchpad, mobile, dataconnect, tools).
- Common Configuration: Shared variables and environment configuration are now centralized in common.mk.
- Removal of Unused Tasks: Tasks related to the admin console, base44 export workflow, IAP configuration, and API test harness have been removed as they are no longer relevant.
- Updated Help Command: The help command now reflects the new modular structure and available commands.
- Readability, Maintainability, Scalability: The modular structure makes the Makefile easier to read, simpler to maintain, and straightforward to extend with new tasks and features.
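With this layout, the top-level Makefile typically reduces to a handful of include directives, one per module named in the message above (a sketch; the comments describing each module are illustrative):

```make
# Top-level Makefile: delegates all targets to the modular *.mk files
include common.mk      # shared variables and environment configuration
include web.mk         # web frontend tasks
include launchpad.mk   # internal launchpad deployment tasks
include mobile.mk      # mobile app development tasks
include dataconnect.mk # Data Connect management tasks
include tools.mk       # development tools (e.g. git hooks)
```

Because `include` merges the files into a single namespace, common.mk must be listed first so its variables are defined before the task files reference them.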
bwnyasse
2026-01-10 20:59:18 -05:00
parent d0a536ffa4
commit b8d156b35a
47 changed files with 272 additions and 7124 deletions


@@ -1,106 +0,0 @@
#!/usr/bin/env python3
import subprocess
import os
import re
import argparse

# --- Configuration ---
INPUT_FILE = "issues-to-create.md"
DEFAULT_PROJECT_TITLE = "Krow"
# ---

def create_issue(title, body, labels, milestone, project_title=None):
    """Creates a GitHub issue using the gh CLI."""
    command = ["gh", "issue", "create"]
    command.extend(["--title", title])
    command.extend(["--body", body])
    if project_title:
        command.extend(["--project", project_title])
    if milestone:
        command.extend(["--milestone", milestone])
    for label in labels:
        command.extend(["--label", label])
    print(f" -> Creating issue: \"{title}\"")
    try:
        result = subprocess.run(command, check=True, text=True, capture_output=True)
        print(result.stdout.strip())
    except subprocess.CalledProcessError as e:
        print(f"❌ ERROR: Failed to create issue '{title}'.")
        print(f" Stderr: {e.stderr.strip()}")

def main():
    """Main function to parse the file and create issues."""
    parser = argparse.ArgumentParser(description="Bulk create GitHub issues from a markdown file.")
    parser.add_argument("--project", default=DEFAULT_PROJECT_TITLE, help="GitHub Project title to add issues to.")
    parser.add_argument("--no-project", action="store_true", help="Do not add issues to any project.")
    args = parser.parse_args()
    project_title = args.project if not args.no_project else None
    print(f"🚀 Starting bulk creation of GitHub issues from '{INPUT_FILE}'...")
    if project_title:
        print(f" Target Project: '{project_title}'")
    else:
        print(" Target Project: (None)")
    if subprocess.run(["which", "gh"], capture_output=True).returncode != 0:
        print("❌ ERROR: GitHub CLI (gh) is not installed.")
        exit(1)
    if not os.path.exists(INPUT_FILE):
        print(f"❌ ERROR: Input file {INPUT_FILE} not found.")
        exit(1)
    print("✅ Dependencies and input file found.")
    print(f"2. Reading and parsing {INPUT_FILE}...")
    with open(INPUT_FILE, 'r') as f:
        content = f.read()
    # Split the content by lines starting with '# '
    issue_blocks = re.split(r'\n(?=#\s)', content)
    for block in issue_blocks:
        if not block.strip():
            continue
        lines = block.strip().split('\n')
        title = lines[0].replace('# ', '').strip()
        labels_line = ""
        milestone_line = ""
        body_start_index = 1
        # Find all metadata lines (Labels, Milestone) at the beginning of the body
        for i, line in enumerate(lines[1:]):
            line_lower = line.strip().lower()
            if line_lower.startswith('labels:'):
                labels_line = line.split(':', 1)[1].strip()
            elif line_lower.startswith('milestone:'):
                milestone_line = line.split(':', 1)[1].strip()
            elif line.strip() == "":
                continue  # Ignore blank lines in the metadata header
            else:
                # This is the first real line of the body
                body_start_index = i + 1
                break
        body = "\n".join(lines[body_start_index:]).strip()
        labels = [label.strip() for label in labels_line.split(',') if label.strip()]
        milestone = milestone_line
        if not title:
            print("⚠️ Skipping block with no title.")
            continue
        create_issue(title, body, labels, milestone, project_title)
    print("\n🎉 Bulk issue creation complete!")

if __name__ == "__main__":
    main()
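The block parsing in the script above hinges on `re.split` with a lookahead, so each `# `-prefixed heading starts a new block while the heading itself is kept. A minimal sketch of that behavior (the sample issue titles are made up for illustration):

```python
import re

# Sample input in the shape the script expects: '# ' title lines,
# optional Labels:/Milestone: metadata, then the body.
content = (
    "# First issue\n"
    "Labels: bug, ui\n"
    "\n"
    "Body of the first issue.\n"
    "# Second issue\n"
    "Milestone: v1.0\n"
    "Body of the second issue.\n"
)

# Split wherever a newline is followed by '# '; the lookahead leaves
# the heading at the start of the next block instead of consuming it.
blocks = re.split(r'\n(?=#\s)', content)
titles = [b.strip().split('\n')[0].replace('# ', '').strip() for b in blocks]
print(titles)  # → ['First issue', 'Second issue']
```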


@@ -1,97 +0,0 @@
#!/bin/bash
# ====================================================================================
# SCRIPT TO EXPORT GITHUB ISSUES TO A MARKDOWN FILE
# ====================================================================================
set -e # Exit script if a command fails

# --- Default Configuration ---
ISSUE_LIMIT=1000
DEFAULT_LABEL="sred-eligible"
DEFAULT_STATE="open"

# --- Parse Command Line Arguments ---
STATE=""
LABEL=""
# If no arguments are provided, run in legacy SR&ED mode
if [ $# -eq 0 ]; then
  STATE=$DEFAULT_STATE
  LABEL=$DEFAULT_LABEL
else
  while [[ "$#" -gt 0 ]]; do
    case $1 in
      --state=*) STATE="${1#*=}" ;;
      --state) STATE="$2"; shift ;;
      --label=*) LABEL="${1#*=}" ;;
      --label) LABEL="$2"; shift ;;
      *) echo "Unknown parameter passed: $1"; exit 1 ;;
    esac
    shift
  done
fi

# --- Dynamic Configuration ---
STATE_FOR_CMD=${STATE:-"open"} # Default to open if state is empty
LABEL_FOR_FILENAME=${LABEL:-"all"}
STATE_FOR_FILENAME=${STATE:-"open"}
OUTPUT_FILE="export-issues-${STATE_FOR_FILENAME}-${LABEL_FOR_FILENAME}.md"
TITLE="Export of GitHub Issues (State: ${STATE_FOR_FILENAME}, Label: ${LABEL_FOR_FILENAME})"
FETCH_MESSAGE="Fetching issues with state '${STATE_FOR_CMD}'"
if [ -n "$LABEL" ]; then
  FETCH_MESSAGE="${FETCH_MESSAGE} and label '${LABEL}'"
fi
echo "🚀 Starting export of GitHub issues to '${OUTPUT_FILE}'..."

# --- Step 1: Dependency Check ---
echo "1. Checking for 'gh' CLI dependency..."
if ! command -v gh &> /dev/null; then
  echo "❌ ERROR: GitHub CLI ('gh') is not installed. Please install it to continue."
  exit 1
fi
echo "✅ 'gh' CLI found."

# --- Step 2: Initialize Output File ---
echo "# ${TITLE}" > "$OUTPUT_FILE"
echo "" >> "$OUTPUT_FILE"
echo "*Export generated on $(date)*." >> "$OUTPUT_FILE"
echo "" >> "$OUTPUT_FILE"

# --- Step 3: Build 'gh' command and Fetch Issues ---
echo "2. ${FETCH_MESSAGE}..."
GH_COMMAND=("gh" "issue" "list" "--state" "${STATE_FOR_CMD}" "--limit" "$ISSUE_LIMIT" "--json" "number")
if [ -n "$LABEL" ]; then
  GH_COMMAND+=("--label" "${LABEL}")
fi
issue_numbers=$("${GH_COMMAND[@]}" | jq -r '.[].number')
if [ -z "$issue_numbers" ]; then
  echo "⚠️ No issues found matching the criteria."
  echo "" >> "$OUTPUT_FILE"
  echo "**No issues found.**" >> "$OUTPUT_FILE"
  exit 0
fi
total_issues=$(echo "$issue_numbers" | wc -l | xargs)
echo "✅ Found ${total_issues} issue(s)."

# --- Step 4: Loop Through Each Issue and Format the Output ---
echo "3. Formatting details for each issue..."
current_issue=0
for number in $issue_numbers; do
  current_issue=$((current_issue + 1))
  echo " -> Processing issue #${number} (${current_issue}/${total_issues})"
  # Use 'gh issue view' with a template to format the output for each issue
  gh issue view "$number" --json number,title,body,author,createdAt,state --template \
    '\n### [#{{.number}}] {{.title}} ({{.state}})\n\n{{if .body}}{{.body}}{{else}}*No description provided.*{{end}}\n\n**Meta:** {{.author.login}} | **Created:** {{timefmt "2006-01-02" .createdAt}}\n***\n' >> "$OUTPUT_FILE"
done

echo ""
echo "🎉 Export complete!"
echo "Your markdown file is ready: ${OUTPUT_FILE}"


@@ -1,100 +0,0 @@
const fs = require('fs');
const path = require('path');
const projectRoot = path.resolve(__dirname, '..');
const clientFilePath = path.join(projectRoot, 'frontend-web', 'src', 'api', 'base44Client.js');
const patchedMock = `// import { createClient } from '@base44/sdk';
// --- MIGRATION MOCK ---
// This mock completely disables the Base44 SDK to allow for local development.
// It also simulates user roles for the RoleSwitcher component.
const MOCK_USER_KEY = 'krow_mock_user_role';
// Default mock user data
const DEFAULT_MOCK_USER = {
  id: "mock-user-123",
  email: "dev@example.com",
  full_name: "Dev User",
  // 'role' is the Base44 default, 'user_role' is our custom field
  role: "admin",
  user_role: "admin", // Default role for testing
  profile_picture: "https://i.pravatar.cc/150?u=a042581f4e29026024d",
};
// Function to get the current mock user state
const getMockUser = () => {
  try {
    const storedRole = localStorage.getItem(MOCK_USER_KEY);
    if (storedRole) {
      return { ...DEFAULT_MOCK_USER, user_role: storedRole, role: storedRole };
    }
    // If no role is stored, set the default and return it
    localStorage.setItem(MOCK_USER_KEY, DEFAULT_MOCK_USER.user_role);
    return DEFAULT_MOCK_USER;
  } catch (e) {
    // localStorage is not available (e.g., in SSR)
    return DEFAULT_MOCK_USER;
  }
};
export const base44 = {
  auth: {
    me: () => Promise.resolve(getMockUser()),
    logout: () => {
      try {
        localStorage.removeItem(MOCK_USER_KEY); // Clear role on logout
        // Optionally, redirect to login page or reload
        window.location.reload();
      } catch (e) {
        // localStorage is not available
      }
      return Promise.resolve();
    },
    updateMe: (data) => {
      try {
        if (data.user_role) {
          localStorage.setItem(MOCK_USER_KEY, data.user_role);
        }
      } catch (e) {
        // localStorage is not available
      }
      // Simulate a successful update
      return Promise.resolve({ ...getMockUser(), ...data });
    },
  },
  entities: {
    ActivityLog: {
      filter: () => Promise.resolve([]),
    },
    // Add other entity mocks as needed for the RoleSwitcher to function
    // For now, the RoleSwitcher only updates the user role, so other entities might not be critical.
  },
  integrations: {
    Core: {
      SendEmail: () => Promise.resolve({ status: "sent" }),
      UploadFile: () => Promise.resolve({ file_url: "mock-file-url" }),
      InvokeLLM: () => Promise.resolve({ result: "mock-ai-response" }),
      // Add other integration mocks if the RoleSwitcher indirectly calls them
    }
  }
};`;
try {
  const content = fs.readFileSync(clientFilePath, 'utf8');
  // Check if the file is the original, unpatched version from the export
  if (content.includes("createClient({")) {
    fs.writeFileSync(clientFilePath, patchedMock, 'utf8');
    console.log('✅ Successfully patched frontend-web/src/api/base44Client.js');
  } else if (content.includes("const MOCK_USER_KEY")) {
    console.log(' base44Client.js is already patched. Skipping.');
  } else {
    console.error('❌ Patching failed: Could not find the expected code in base44Client.js. The export format may have changed.');
    process.exit(1);
  }
} catch (error) {
  console.error('❌ An error occurred during patching:', error);
  process.exit(1);
}


@@ -1,35 +0,0 @@
const fs = require('fs');
const path = require('path');
const projectRoot = path.resolve(__dirname, '..');
const dashboardFilePath = path.join(projectRoot, 'frontend-web', 'src', 'pages', 'Dashboard.jsx');
const oldString = ` <PageHeader
title="Welcome to KROW"`;
const newString = ` <PageHeader
title={
<div className="flex items-center gap-2">
<span>Welcome to KROW</span>
{import.meta.env.VITE_APP_ENV === 'staging' && <span className="bg-yellow-100 text-yellow-800 text-xs font-semibold px-2.5 py-0.5 rounded-full">Staging</span>}
{import.meta.env.VITE_APP_ENV === 'dev' && <span className="bg-slate-200 text-slate-800 text-xs font-semibold px-2.5 py-0.5 rounded-full">Dev</span>}
</div>
}`;
try {
  const content = fs.readFileSync(dashboardFilePath, 'utf8');
  if (content.includes(oldString)) {
    const newContent = content.replace(oldString, newString);
    fs.writeFileSync(dashboardFilePath, newContent, 'utf8');
    console.log('✅ Successfully patched Dashboard.jsx to include environment label.');
  } else if (content.includes('VITE_APP_ENV')) {
    console.log(' Dashboard.jsx is already patched for environment labels. Skipping.');
  } else {
    console.error('❌ Patching Dashboard.jsx failed: Could not find the PageHeader title.');
    process.exit(1);
  }
} catch (error) {
  console.error('❌ An error occurred during patching Dashboard.jsx:', error);
  process.exit(1);
}


@@ -1,41 +0,0 @@
const fs = require('fs');
const path = require('path');
const projectRoot = path.resolve(__dirname, '..');
const indexPath = path.join(projectRoot, 'frontend-web', 'index.html');
const oldTitle = '<title>Base44 APP</title>';
const newTitle = '<title>KROW</title>';
const oldFavicon = '<link rel="icon" type="image/svg+xml" href="https://base44.com/logo_v2.svg" />';
const newFavicon = '<link rel="icon" type="image/svg+xml" href="https://krow-workforce-dev-launchpad.web.app/favicon.svg" />';
try {
  let content = fs.readFileSync(indexPath, 'utf8');
  if (content.includes(oldTitle)) {
    content = content.replace(oldTitle, newTitle);
    console.log('✅ Successfully patched title in frontend-web/index.html.');
  } else if (content.includes(newTitle)) {
    console.log(' index.html title is already patched. Skipping.');
  } else {
    console.error('❌ Patching index.html failed: Could not find the expected title tag.');
    process.exit(1);
  }
  if (content.includes(oldFavicon)) {
    content = content.replace(oldFavicon, newFavicon);
    console.log('✅ Successfully patched favicon in frontend-web/index.html.');
  } else if (content.includes(newFavicon)) {
    console.log(' index.html favicon is already patched. Skipping.');
  } else {
    console.error('❌ Patching index.html failed: Could not find the expected favicon link.');
    process.exit(1);
  }
  fs.writeFileSync(indexPath, content, 'utf8');
} catch (error) {
  console.error('❌ An error occurred during patching index.html:', error);
  process.exit(1);
}


@@ -1,26 +0,0 @@
const fs = require('fs');
const path = require('path');
const projectRoot = path.resolve(__dirname, '..');
const layoutFilePath = path.join(projectRoot, 'frontend-web', 'src', 'pages', 'Layout.jsx');
const oldString = ` queryKey: ['current-user-layout'],`;
const newString = ` queryKey: ['current-user'],`;
try {
  const content = fs.readFileSync(layoutFilePath, 'utf8');
  if (content.includes(oldString)) {
    const newContent = content.replace(oldString, newString);
    fs.writeFileSync(layoutFilePath, newContent, 'utf8');
    console.log('✅ Successfully patched queryKey in frontend-web/src/pages/Layout.jsx');
  } else if (content.includes(newString)) {
    console.log(' queryKey in Layout.jsx is already patched. Skipping.');
  } else {
    console.error('❌ Patching failed: Could not find the expected queryKey in Layout.jsx.');
    process.exit(1);
  }
} catch (error) {
  console.error('❌ An error occurred during patching Layout.jsx queryKey:', error);
  process.exit(1);
}


@@ -1,122 +0,0 @@
import fs from 'fs';
import path from 'path';
import { fileURLToPath } from 'url';
// Replicate __dirname functionality in ES modules
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
const projectRoot = path.join(__dirname, '..', 'frontend-web');
// --- Patching Functions ---
function applyPatch(filePath, patchInfo) {
  const fullPath = path.join(projectRoot, filePath);
  if (!fs.existsSync(fullPath)) {
    console.warn(`🟡 File not found, patch skipped: ${filePath}`);
    return;
  }
  let content = fs.readFileSync(fullPath, 'utf8');
  if (patchInfo.new_string && content.includes(patchInfo.new_string)) {
    console.log(`✅ Patch already applied in ${filePath}.`);
    return;
  }
  if (patchInfo.old_string && content.includes(patchInfo.old_string)) {
    content = content.replace(patchInfo.old_string, patchInfo.new_string);
    fs.writeFileSync(fullPath, content, 'utf8');
    console.log(`🟢 Patch applied in ${filePath}.`);
  } else {
    console.error(`🔴 Could not apply patch in ${filePath}. String not found.`);
  }
}

// --- Global Import Fix ---
function fixAllComponentImports(directory) {
  const entries = fs.readdirSync(directory, { withFileTypes: true });
  for (const entry of entries) {
    const fullPath = path.join(directory, entry.name);
    if (entry.isDirectory()) {
      // Recursively search in subdirectories
      fixAllComponentImports(fullPath);
    } else if (entry.isFile() && (entry.name.endsWith('.jsx') || entry.name.endsWith('.js'))) {
      let content = fs.readFileSync(fullPath, 'utf8');
      const originalContent = content;
      // Regex to find all relative imports to the components directory
      // Handles: from "./components/", from "../components/", from "../../components/", etc.
      const importRegex = /from\s+(['"])((\.\.\/)+|\.\/)components\//g;
      content = content.replace(importRegex, 'from $1@/components/');
      if (content !== originalContent) {
        console.log(`✅ Fixing component imports in ${fullPath}`);
        fs.writeFileSync(fullPath, content, 'utf8');
      }
    }
  }
}
// --- Execution ---
function main() {
console.log('--- Applying patches for local environment ---');
// The specific patches are now less critical as the global fix is more robust,
// but we can keep them for specific, non-import related changes.
const patches = [
{
file: 'src/main.jsx',
old_string: `ReactDOM.createRoot(document.getElementById('root')).render(
<App />
)`,
new_string: `import { QueryClient, QueryClientProvider } from '@tanstack/react-query';
const queryClient = new QueryClient();
ReactDOM.createRoot(document.getElementById('root')).render(
<React.StrictMode>
<QueryClientProvider client={queryClient}>
<App />
</QueryClientProvider>
</React.StrictMode>,
)`
},
{
file: 'src/pages/Layout.jsx',
old_string: `const { data: user } = useQuery({
queryKey: ['current-user-layout'],
queryFn: () => base44.auth.me(),
});`,
new_string: ` // const { data: user } = useQuery({
// queryKey: ['current-user-layout'],
// queryFn: () => base44.auth.me(),
// });
// Mock user data to prevent redirection and allow local development
const user = {
full_name: "Dev User",
email: "dev@example.com",
user_role: "admin", // You can change this to 'procurement', 'operator', 'client', etc. to test different navigation menus
profile_picture: "https://i.pravatar.cc/150?u=a042581f4e29026024d",
};
`
}
];
patches.forEach(patchInfo => {
applyPatch(patchInfo.file, patchInfo);
});
console.log('--- Global component import fixes ---');
fixAllComponentImports(path.join(projectRoot, 'src'));
console.log('--- End of patching process ---');
}
main();


@@ -1,72 +0,0 @@
#!/bin/bash
# ====================================================================================
# SCRIPT TO SETUP STANDARD GITHUB LABELS FROM A YAML FILE (v4 - Robust Bash Parsing)
# ====================================================================================
set -e # Exit script if a command fails

LABELS_FILE="labels.yml"
echo "🚀 Setting up GitHub labels from '${LABELS_FILE}'..."

# --- Function to create or edit a label ---
create_or_edit_label() {
  NAME=$1
  DESCRIPTION=$2
  COLOR=$3
  if [ -z "$NAME" ] || [ -z "$DESCRIPTION" ] || [ -z "$COLOR" ]; then
    echo "⚠️ Skipping invalid label entry."
    return
  fi
  # The `gh api` command will exit with a non-zero status if the label is not found (404).
  # We redirect stderr to /dev/null to silence the expected "Not Found" error message.
  if gh api "repos/{owner}/{repo}/labels/${NAME}" --silent 2>/dev/null; then
    echo " - Editing existing label: '${NAME}'"
    gh label edit "${NAME}" --description "${DESCRIPTION}" --color "${COLOR}"
  else
    echo " - Creating new label: '${NAME}'"
    gh label create "${NAME}" --description "${DESCRIPTION}" --color "${COLOR}"
  fi
}

# --- Read and Parse YAML File using a robust while loop ---
# This approach is more reliable than complex sed/awk pipelines.
name=""
description=""
color=""
while IFS= read -r line || [[ -n "$line" ]]; do
  # Skip comments and empty lines
  if [[ "$line" =~ ^\s*# ]] || [[ -z "$line" ]]; then
    continue
  fi
  # Check for name
  if [[ "$line" =~ -[[:space:]]+name:[[:space:]]+\"(.*)\" ]]; then
    # If we find a new name, and the previous one was complete, process it.
    if [ -n "$name" ] && [ -n "$description" ] && [ -n "$color" ]; then
      create_or_edit_label "$name" "$description" "$color"
      # Reset for the next entry
      description=""
      color=""
    fi
    name="${BASH_REMATCH[1]}"
  # Check for description
  elif [[ "$line" =~ [[:space:]]+description:[[:space:]]+\"(.*)\" ]]; then
    description="${BASH_REMATCH[1]}"
  # Check for color
  elif [[ "$line" =~ [[:space:]]+color:[[:space:]]+\"(.*)\" ]]; then
    color="${BASH_REMATCH[1]}"
  fi
done < "$LABELS_FILE"

# Process the very last label in the file
if [ -n "$name" ] && [ -n "$description" ] && [ -n "$color" ]; then
  create_or_edit_label "$name" "$description" "$color"
fi

echo ""
echo "🎉 All standard labels have been created or updated successfully."
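For reference, the bash regexes above require labels.yml entries with quoted values in this shape (the `sred-eligible` name appears elsewhere in these scripts; the description and color here are illustrative):

```yaml
- name: "sred-eligible"
  description: "Work that may qualify for SR&ED reporting"
  color: "0E8A16"
```

Each entry must keep `name`, `description`, and `color` double-quoted, since the parser's `\"(.*)\"` capture groups match nothing otherwise and the entry is silently skipped.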