5 Commits

Author SHA1 Message Date
c12387ac94  fix: t-005 a t-0029 correctif  (2026-03-18 09:00:11 +01:00)
bdb65a09ff  fix: t-001 a t-005 correctif  (2026-03-17 15:33:36 +01:00)
13457ceb5a  fix: readme  (2026-03-17 14:26:49 +01:00)
f30d75141d  Merge pull request 'fix: resolve production runtime issues' (#19) from fix/prod-runtime-issues into develop, Reviewed-on: #19  (2026-03-17 12:52:21 +00:00)
8886e8b7df  fix: resolve production runtime issues  (2026-03-17 13:50:13 +01:00)
23 changed files with 371 additions and 105 deletions

---

@@ -10,14 +10,16 @@ BACKUPS_REMOTE_HOST=
 BACKUPS_REMOTE_ROOT=
 BACKUPS_MAX_FILES=
-# DISK_COMMAND_REMOTE and DISK_COMMAND_LOCAL for the disk space check commands
-DISK_COMMAND_REMOTE=
-DISK_COMMAND_LOCAL=
-# BACKUP_SCRIPT_COMMAND_BACKUP_BDD_RECETTE, BACKUP_SCRIPT_COMMAND_CHECK_STATUT_RECETTE and BACKUP_SCRIPT_COMMAND_BACKUP_VAULTWARDEN for the backup and status-check commands
-BACKUP_SCRIPT_COMMAND_BACKUP_BDD_RECETTE=
-BACKUP_SCRIPT_COMMAND_CHECK_STATUT_RECETTE=
-BACKUP_SCRIPT_COMMAND_BACKUP_VAULTWARDEN=
+# Parameters used to build the disk and backup commands
+DISK_REMOTE_HOST=malio-b
+DISK_LOCAL_SCRIPT_DIR=/home/malio/Malio-ops/CheckStorage
+DISK_REMOTE_SCRIPT_DIR=/home/malio-b/Malio-ops/CheckStorage
+RECETTE_SCRIPTS_DIR=/home/malio/Malio-ops/RecetteScripts
+VAULTWARDEN_SSH_HOST=bitwarden
+VAULTWARDEN_SCRIPTS_DIR=/home/matt/vaultwarden/Malio-ops/BackupVaultWarden
 # What time the backups should run (24h format)
 BACKUPS_HOUR=19
+# Set to true so that authentication cookies are secure (HTTPS only)
+AUTH_COOKIE_SECURE=

---

@@ -95,6 +95,19 @@ Visible consequence:
 - if `API_SECRET_KEY` is empty, API calls are rejected with `401 Unauthorized`
 - the web application also sets an HTTP-only cookie via `server/middleware/auth-cookie.ts` to reuse this secret on the browser side
+## Security
+The project's current behaviour rests on a very strong exposure assumption.
+- `server/middleware/auth-cookie.ts` automatically sets the `api_auth_token` cookie for any visitor who loads the web interface
+- this cookie then grants access to the `/api/*` routes protected by `API_SECRET_KEY`
+- there is no user login or separate identity check in the repository
+Consequence:
+- `Supervisor` must be deployed only on a trusted network, behind a VPN, an IP restriction, an authenticating proxy, or another external access control
+- if the application is exposed publicly without additional protection, this mechanism is not sufficient authentication
 ### SSH for backups
 The backup features use `ssh` with the `BatchMode=yes` and `ConnectTimeout=5` options in `server/utils/ssh.ts`. This implies access without interactive password entry.
@@ -126,4 +139,4 @@ Usage:
 - `npm run generate`: generates a static output if that mode is compatible with your use case
 - `npm run preview`: previews the Nuxt build
 - `npm run lint`: runs ESLint
 - `npm run lint:fix`: applies automatic ESLint fixes: periodic CPU, memory and network collection

---

@@ -15,7 +15,6 @@
 --color-m-success: rgb(var(--m-success));
 --color-m-accent: rgb(var(--m-accent));
 --color-m-warning: rgb(var(--m-warning));
---color-m-succes: rgb(var(--m-success));
 --font-display: "Outfit", system-ui, sans-serif;
 --font-mono: "JetBrains Mono", "Fira Code", monospace;
 }

---

@@ -77,7 +77,6 @@
 </template>
 <script setup lang="ts">
-import { computed, onMounted, ref } from "vue"
 import { Icon as IconifyIcon } from "@iconify/vue"
 import { apiFetch } from "~/composables/useApiAuth"
@@ -116,7 +115,6 @@ const active = ref<string | null>(null)
 const loading = ref(true)
 const runningKey = ref<string | null>(null)
 const scripts = ref<BackupScript[]>([])
-const output = ref<string>("")
 const message = ref<string>("")
 const isError = ref(false)
@@ -125,7 +123,6 @@ const statusClass = computed(() => (isError.value ? "status-error" : "status-suc
 const loadScripts = async () => {
 loading.value = true
 message.value = ""
-output.value = ""
 isError.value = false
 emit("result", {
 key: null,
@@ -156,7 +153,6 @@ const loadScripts = async () => {
 const runScript = async (key: string) => {
 active.value = key
 runningKey.value = key
-output.value = ""
 message.value = ""
 isError.value = false
@@ -165,30 +161,17 @@ const runScript = async (key: string) => {
 method: "POST",
 body: { key }
 })
+const resultOutput = data.output || "Aucune sortie retournee."
 message.value = `${data.label} execute avec succes`
-output.value = data.output || "Aucune sortie retournee."
 emit("result", {
 key: data.key,
 label: data.label,
-output: output.value,
+output: resultOutput,
 isError: false,
 downloadFolders: data.downloadFolders || []
 })
 } catch (error: unknown) {
-isError.value = true
-const statusMessage =
-typeof error === "object" &&
-error !== null &&
-"data" in error &&
-typeof error.data === "object" &&
-error.data !== null &&
-"statusMessage" in error.data &&
-typeof error.data.statusMessage === "string"
-? error.data.statusMessage
-: null
-message.value = statusMessage || "Erreur execution script"
-output.value = ""
+message.value = (error as any)?.data?.statusMessage || "Erreur execution script"
 emit("result", {
 key,
 label: scripts.value.find((item) => item.key === key)?.label || key,

---

@@ -293,7 +293,7 @@ const visibleHistory = computed(() => {
 return history.value.filter((point) => point.sampledAt >= minTimestamp)
 })
-const scaleMax = computed(() => 100)
+const scaleMax = 100
 const formatValue = (value: number) => `${Math.round(value)}%`

---

@@ -42,7 +42,6 @@ export function useApiAuthHeader() {
 // All frontend calls to /api/* reuse this shared header.
 return {
-Authorization: `Bearer ${token}`
 }
 }

---

@@ -146,7 +146,6 @@
 </template>
 <script setup lang="ts">
-import {ref} from "vue"
 import {Icon as IconifyIcon} from "@iconify/vue"
 import logoSrc from '~/assets/LOGO_CARRE_BLANC.png'

---

@@ -4,12 +4,12 @@ import tailwindcss from "@tailwindcss/vite"
 const getRepoVersion = () => {
 try {
 const tags = execSync(
 "git for-each-ref --sort=-version:refname --format='%(refname:short)' refs/tags",
 { encoding: "utf8" }
 )
 .split("\n")
 .map((tag) => tag.trim())
 .filter(Boolean)
 return tags[0] || "dev"
 } catch {
@@ -26,7 +26,7 @@ export default defineNuxtConfig({
 head: {
 link: [
 { rel: "preconnect", href: "https://fonts.googleapis.com" },
-{ rel: "preconnect", href: "https://fonts.gstatic.com ", crossorigin: "" },
+{ rel: "preconnect", href: "https://fonts.gstatic.com", crossorigin: "" },
 {
 rel: "stylesheet",
 href: "https://fonts.googleapis.com/css2?family=JetBrains+Mono:wght@400;500;600;700&family=Outfit:wght@300;400;500;600;700;800;900&display=swap"
@@ -36,8 +36,11 @@ export default defineNuxtConfig({
 },
 runtimeConfig: {
 apiSecretKey: process.env.API_SECRET_KEY,
+discordBotToken: process.env.DISCORD_BOT_TOKEN,
+discordChannelId: process.env.DISCORD_CHANNEL_ID,
 public: {
-appVersion: getRepoVersion()
+appVersion: getRepoVersion(),
+apiKey: process.env.API_SECRET_KEY
 }
 },
 vite: {

---
package-lock.json (generated)

@@ -1,10 +1,10 @@
 {
-"name": "disk-monitor",
+"name": "supervisor",
 "lockfileVersion": 3,
 "requires": true,
 "packages": {
 "": {
-"name": "disk-monitor",
+"name": "supervisor",
 "hasInstallScript": true,
 "dependencies": {
 "@iconify/vue": "^5.0.0",
@@ -15,11 +15,13 @@
 "@semantic-release/changelog": "^6.0.3",
 "@semantic-release/commit-analyzer": "^13.0.1",
 "@semantic-release/git": "^10.0.1",
-"@semantic-release/github": "^12.0.6",
 "@semantic-release/release-notes-generator": "^14.1.0",
 "@tailwindcss/vite": "^4.2.1",
 "semantic-release": "^25.0.3",
 "tailwindcss": "^4.2.1"
+},
+"engines": {
+"node": ">=20"
 }
 },
 "node_modules/@actions/core": {

---

@@ -1,7 +1,10 @@
 {
-"name": "disk-monitor",
+"name": "supervisor",
 "type": "module",
 "private": true,
+"engines": {
+"node": ">=20"
+},
 "scripts": {
 "build": "nuxt build",
 "dev": "nuxt dev",
@@ -20,7 +23,6 @@
 "@semantic-release/changelog": "^6.0.3",
 "@semantic-release/commit-analyzer": "^13.0.1",
 "@semantic-release/git": "^10.0.1",
-"@semantic-release/github": "^12.0.6",
 "@semantic-release/release-notes-generator": "^14.1.0",
 "@tailwindcss/vite": "^4.2.1",
 "semantic-release": "^25.0.3",

---

@@ -66,7 +66,6 @@
 </template>
 <script setup lang="ts">
-import {computed, onMounted, ref} from "vue"
 import { apiFetch } from "~/composables/useApiAuth"
 import type { SystemMetrics } from "~/types/system";
@@ -334,14 +333,6 @@ onBeforeUnmount(() => {
 grid-template-columns: 1fr;
 }
-.backup-selector {
-order: 2;
-}
-.backup-list-mobile {
-order: 3;
-}
 .speedtest-card-mobile {
 order: 4;
 }

---

@@ -1,2 +1,2 @@
 User-Agent: *
-Disallow:
+Disallow: /

---

@@ -3,12 +3,10 @@ import {
 shellQuote,
 resolveFolderRemoteDir,
 REMOTE_ROOT,
+isSafeFolder,
 } from "../utils/ssh.ts"
-import {process} from "std-env";
-const MAX_FILES_PER_FOLDER = Number(process.env.BACKUPS_MAX_FILES)
-const isSafeFolder = (value: string) => /^[a-zA-Z0-9._-]+$/.test(value)
+const MAX_FILES_PER_FOLDER = Math.max(1, Number(process.env.BACKUPS_MAX_FILES) || 50)
 function isMissingPathError(error: unknown): boolean {

---

@@ -1,6 +1,7 @@
-export default defineEventHandler(async () => {
-const token = process.env.DISCORD_BOT_TOKEN
-const channel = process.env.DISCORD_CHANNEL_ID
+export default defineEventHandler(async (event) => {
+const config = useRuntimeConfig(event)
+const token = config.discordBotToken
+const channel = config.discordChannelId
 if (!token || !channel) {
 throw createError({

---

@@ -1,38 +1,49 @@
-import { exec } from "child_process"
+import { execFile } from "node:child_process"
 type DiskSource = {
-key: string
+key: "remote" | "local"
 label: string
+}
+type CommandSpec = {
 command: string
-args?: string[]
+args: string[]
+cwd?: string
 }
 const diskSources: DiskSource[] = [
 {
 key: "remote",
-label: "Serveur distant",
-command: "ssh",
-args: []
+label: "Serveur distant"
 },
 {
 key: "local",
-label: "Machine locale",
-command: "bash",
-args: []
+label: "Machine locale"
 }
 ]
-function getEnvCommand(source: DiskSource) {
-const envKey = `DISK_COMMAND_${source.key.toUpperCase()}`
-const legacyEnvKey =
-source.key === "remote" ? "DISK_REMOTE_COMMAND" : source.key === "local" ? "DISK_LOCAL_COMMAND" : ""
-return process.env[envKey] || (legacyEnvKey ? process.env[legacyEnvKey] : undefined) || null
+function getCommand(source: DiskSource): CommandSpec {
+const localScriptDir = process.env.DISK_LOCAL_SCRIPT_DIR || "/home/malio/Malio-ops/CheckStorage"
+const remoteHost = process.env.DISK_REMOTE_HOST || "malio-b"
+const remoteScriptDir = process.env.DISK_REMOTE_SCRIPT_DIR || "/home/malio-b/Malio-ops/CheckStorage"
+if (source.key === "local") {
+return {
+command: "bash",
+args: ["check-storage.sh"],
+cwd: localScriptDir
+}
+}
+return {
+command: "ssh",
+args: [remoteHost, `cd ${remoteScriptDir} && ./check-storage.sh`]
+}
 }
-function runShellCommand(command: string): Promise<string> {
+function runCommand({ command, args, cwd }: CommandSpec): Promise<string> {
 return new Promise((resolve, reject) => {
-exec(command, (error, stdout, stderr) => {
+execFile(command, args, { cwd }, (error, stdout, stderr) => {
 if (error) {
 reject(stderr || error.message)
 return
@@ -46,12 +57,7 @@ export default defineEventHandler(async () => {
 const results = await Promise.all(
 diskSources.map(async (source) => {
 try {
-const envCommand = getEnvCommand(source)
-if (!envCommand) {
-throw new Error(`Commande disque manquante pour ${source.key}`)
-}
-const output = await runShellCommand(envCommand)
+const output = await runCommand(getCommand(source))
 return {
 key: source.key,
 label: source.label,

---

@@ -3,11 +3,9 @@ import {
 shellQuote,
 resolveFolderRemoteDir,
 REMOTE_HOST,
+isSafeFolder
 } from "../utils/ssh.ts"
-import {spawn} from "unenv/node/child_process";
-const isSafeFolder = (value: string) => /^[a-zA-Z0-9._-]+$/.test(value)
+import { spawn } from "node:child_process"
 async function getLatestRemoteFile(remoteDir: string): Promise<string | null> {
 const output = await runSsh(`cd ${shellQuote(remoteDir)} && ls -1A | sort -r | head -n 1`)
@@ -46,6 +44,9 @@ export default defineEventHandler(async (event) => {
 }
 const fileName = await getLatestRemoteFile(remoteDir)
+if (!fileName || !isSafeFolder(fileName)) {
+continue
+}
 if (!fileName) {
 continue
 }
@@ -94,6 +95,6 @@ export default defineEventHandler(async (event) => {
 console.error(`Erreur archive SSH (${code}): ${stderr}`)
 }
 })
+event.node.res.on("close", () => child.kill())
 return sendStream(event, child.stdout)
 })

---

@@ -3,12 +3,10 @@ import {
 shellQuote,
 resolveFolderRemoteDir,
 REMOTE_HOST,
+isSafeFolder,
+isSafeFile
 } from "../utils/ssh.ts"
-import {spawn} from "unenv/node/child_process";
-const isSafeFolder = (value: string) => /^[a-zA-Z0-9._-]+$/.test(value)
-const isSafeFile = (value: string) => /^[a-zA-Z0-9._-]+$/.test(value)
+import { spawn } from "node:child_process"
 function buildContentDisposition(fileName: string) {
 const asciiName = fileName.replace(/[^\x20-\x7E]/g, "_").replace(/["\\]/g, "_")
@@ -61,6 +59,6 @@
 console.error(`Erreur téléchargement SSH (${code}): ${stderr}`)
 }
 })
+event.node.res.on("close", () => child.kill())
 return sendStream(event, child.stdout)
 })

---

@@ -1,9 +1,10 @@
 export default defineEventHandler(async (event) => {
 const req = event.node.req
+const MAX_UPLOAD_BYTES = 100 * 1024 * 1024 // 100MB
 let received = 0
 for await (const chunk of req) {
+if (received > MAX_UPLOAD_BYTES) throw createError({ statusCode: 413, statusMessage: "Fichier trop volumineux" })
 received += chunk.length
 }

---

@@ -1,12 +1,18 @@
 import targets from "../config/version-status-targets.json"
+const REQUEST_TIMEOUT_MS = 5000
 export default defineEventHandler(async () => {
 const results = await Promise.all(
 targets.map(async (target) => {
+const controller = new AbortController()
+const timeoutId = setTimeout(() => controller.abort(), REQUEST_TIMEOUT_MS)
 try {
 const response = await fetch(target.url, {
 method: "GET",
-headers: { Accept: "application/json" }
+headers: { Accept: "application/json" },
+signal: controller.signal
 })
 return {
@@ -25,6 +31,8 @@ export default defineEventHandler(async () => {
 checkedAt: new Date().toISOString(),
 error: error instanceof Error ? error.message : String(error)
 }
+} finally {
+clearTimeout(timeoutId)
 }
 })
 )

---

@@ -1,3 +1,9 @@
+// SECURITY:
+// This middleware automatically sets the authentication cookie for every
+// visitor of the web interface. This behaviour assumes Supervisor is only
+// exposed to a trusted network or sits behind an external access control.
+// If the application becomes publicly accessible, this mechanism is not
+// user authentication.
 export default defineEventHandler((event) => {
 const path = event.path || event.node.req.url || ""
@@ -16,10 +22,12 @@ export default defineEventHandler((event) => {
 return
 }
+const secureCookie = process.env.AUTH_COOKIE_SECURE === "true"
 setCookie(event, "api_auth_token", expectedToken, {
 httpOnly: true,
 sameSite: "lax",
-secure: process.env.NODE_ENV === "production",
+secure: secureCookie,
 path: "/"
 })
 })

---

@@ -25,7 +25,21 @@ export const backupScripts: BackupScript[] = [
 }
 ]
+const getDefaultBackupScriptCommands = (): Record<string, string> => {
+const recetteScriptsDir = process.env.RECETTE_SCRIPTS_DIR || "/home/malio/Malio-ops/RecetteScripts"
+const vaultwardenHost = process.env.VAULTWARDEN_SSH_HOST || "bitwarden"
+const vaultwardenScriptsDir =
+process.env.VAULTWARDEN_SCRIPTS_DIR || "/home/matt/vaultwarden/Malio-ops/BackupVaultWarden"
+return {
+"backup-bdd-recette": `cd ${recetteScriptsDir} && bash backup-bdd-recette.sh`,
+"check-statut-recette": `cd ${recetteScriptsDir} && bash check-statut-recette.sh`,
+"backup-vaultwarden":
+`ssh ${vaultwardenHost} "cd ${vaultwardenScriptsDir} && bash backup-vaultwarden.sh"`
+}
+}
 export function getBackupScriptCommand(key: string) {
 const envKey = `BACKUP_SCRIPT_COMMAND_${key.toUpperCase().replace(/-/g, "_")}`
-return process.env[envKey] || null
+return process.env[envKey] || getDefaultBackupScriptCommands()[key] || null
 }

---

@@ -1,20 +1,23 @@
-import { execFile } from "node:child_process"
-import {process} from "std-env";
+import {execFile} from "node:child_process"
 import folderMap from "#server/config/backup-folders.json";
 export const REMOTE_HOST = process.env.BACKUPS_REMOTE_HOST
 export const REMOTE_ROOT = process.env.BACKUPS_REMOTE_ROOT || "/home/malio-b/backups"
 export const FOLDER_MAP = folderMap as Record<string, string>
+export const isSafeFolder = (value: string) => /^[a-zA-Z0-9._-]+$/.test(value)
+export const isSafeFile = (value: string) => /^[a-zA-Z0-9._-]+$/.test(value)
 export const shellQuote = (value: string) => `'${value.replace(/'/g, `'\\''`)}'`
 export function runSsh(command: string): Promise<string> {
+if (!REMOTE_HOST) {
+return Promise.reject(new Error("BACKUPS_REMOTE_HOST is not configured"))
+}
 return new Promise((resolve, reject) => {
 execFile(
 "ssh",
 ["-o", "BatchMode=yes", "-o", "ConnectTimeout=5", REMOTE_HOST, command],
-{ maxBuffer: 10 * 1024 * 1024 },
+{maxBuffer: 10 * 1024 * 1024},
 (error, stdout, stderr) => {
 if (error) {
 reject(stderr || error.message)
@@ -42,7 +45,7 @@ export async function resolveFolderRemoteDir(folderName: string): Promise<string
 return direct
 }
-const nested = `${REMOTE_ROOT}/bdd_recette/${folderName}`
+const nested = `${REMOTE_ROOT}/bdd-recette/${folderName}`
 if (await remoteDirExists(nested)) {
 return nested
 }

---
solution.md (new file, 235 lines)

@@ -0,0 +1,235 @@
# Final Supervisor deployment fixes
This document summarises the final fixes identified to get `Supervisor` working in production on `recette`.
## 1. Running in production
`Supervisor` is not a simple static site. The project contains:
- server routes in `server/api/*`
- middlewares in `server/middleware/*`
- a server plugin in `server/plugins/metrics-worker.ts`
It therefore has to be built and started as a Node server:
```bash
npm run build
node .output/server/index.mjs
```
In production, the application was started via `pm2`.
## 2. Nginx configuration
The project must be exposed through a reverse proxy to the Node server on `127.0.0.1:3000`.
Minimal working configuration:
```nginx
server {
listen 80;
server_name supervisor.malio-dev.fr;
client_max_body_size 200M;
client_body_timeout 300s;
send_timeout 300s;
location / {
proxy_pass http://127.0.0.1:3000;
proxy_http_version 1.1;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
}
}
```
### Why
- without the reverse proxy, the `/api/*` endpoints do not work
- without `client_max_body_size`, the upload speedtest returns `413 Request Entity Too Large`
After modifying the configuration:
```bash
nginx -t
systemctl reload nginx
```
## 3. Authentication cookie over HTTP
The project was configured to use a `Secure` cookie in production, which blocked every `/api/*` route over HTTP with `401` errors.
Fix applied in `server/middleware/auth-cookie.ts`:
- the cookie's `secure` flag now depends on `AUTH_COOKIE_SECURE`
Value to set when serving over HTTP:
```env
AUTH_COOKIE_SECURE=false
```
If the site is later switched to HTTPS:
```env
AUTH_COOKIE_SECURE=true
```
## 4. Environment variables to use
Example of a working `.env`:
```env
API_SECRET_KEY=...
DISCORD_BOT_TOKEN=...
DISCORD_CHANNEL_ID=...
BACKUPS_REMOTE_HOST=malio-b
BACKUPS_REMOTE_ROOT=/home/malio-b/backups
BACKUPS_MAX_FILES=200
DISK_COMMAND_LOCAL="cd /home/malio/Malio-ops/CheckStorage && bash check-storage.sh"
DISK_COMMAND_REMOTE="ssh malio-b \"cd /home/malio-b/Malio-ops/CheckStorage && bash check-storage.sh\""
BACKUP_SCRIPT_COMMAND_BACKUP_BDD_RECETTE="cd /home/malio/Malio-ops/RecetteScripts && bash backup-bdd-recette.sh"
BACKUP_SCRIPT_COMMAND_CHECK_STATUT_RECETTE="cd /home/malio/Malio-ops/RecetteScripts && bash check-statut-recette.sh"
BACKUP_SCRIPT_COMMAND_BACKUP_VAULTWARDEN="ssh bitwarden \"bash -lc 'cd /home/matt/vaultwarden/Malio-ops/BackupVaultWarden && ./backup-vaultwarden.sh'\""
BACKUPS_HOUR=19
AUTH_COOKIE_SECURE=false
```
### Important
Commands containing spaces, `&&` or quotes must be quoted correctly in the `.env`.
The following format caused errors during a `source .env`:
```env
DISK_COMMAND_LOCAL=bash -lc '...'
```
The shell interpreted it as a command to run, not as a plain value.
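The quoting pitfall can be reproduced in isolation; here is a small sketch (the `/tmp/demo.env` file is illustrative, not part of the project):

```shell
# Write a demo .env whose value contains spaces and '&&'.
# The surrounding quotes keep the whole command line as one plain string.
cat > /tmp/demo.env <<'EOF'
DISK_COMMAND_LOCAL="cd /home/malio/Malio-ops/CheckStorage && bash check-storage.sh"
EOF

# Export everything the file defines, as in the PM2 sequence of section 5
set -a
. /tmp/demo.env
set +a

# The full command line survives as a single value
echo "$DISK_COMMAND_LOCAL"
# -> cd /home/malio/Malio-ops/CheckStorage && bash check-storage.sh
```

Without the quotes, `source` would try to execute `cd ... && bash ...` instead of assigning it.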
## 5. PM2
Variables added to `.env` were not always picked up by an already running PM2 process.
Reliable sequence:
```bash
cd /var/www/Supervisor
set -a
source .env
set +a
pm2 kill
pm2 start .output/server/index.mjs --name supervisor
pm2 save
```
Useful check:
```bash
pm2 list
pm2 env 0 | grep DISK_COMMAND
```
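As an alternative to exporting the `.env` by hand before each restart, PM2 can carry the variables itself in an ecosystem file. This is a sketch, not something the repo ships; the file name `ecosystem.config.cjs` is the PM2 convention, and the values mirror the `.env` shown above:

```javascript
// ecosystem.config.cjs — hypothetical; adjust paths and values to the deployment
module.exports = {
  apps: [
    {
      name: "supervisor",
      script: ".output/server/index.mjs",
      cwd: "/var/www/Supervisor",
      env: {
        BACKUPS_REMOTE_HOST: "malio-b",
        BACKUPS_REMOTE_ROOT: "/home/malio-b/backups",
        BACKUPS_MAX_FILES: "200",
        BACKUPS_HOUR: "19",
        AUTH_COOKIE_SECURE: "false"
        // API_SECRET_KEY, DISCORD_BOT_TOKEN, DISCORD_CHANNEL_ID are secrets;
        // inject them separately rather than committing this file
      }
    }
  ]
}
```

With this in place, `pm2 start ecosystem.config.cjs && pm2 save` keeps the environment attached to the process across restarts, avoiding the stale-variable problem described above.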
## 6. Recette backups
Since `Supervisor` already runs on `ferme` / `recette`, the recette backup scripts must not go back through `ssh ferme`.
Correct:
```env
BACKUP_SCRIPT_COMMAND_BACKUP_BDD_RECETTE="cd /home/malio/Malio-ops/RecetteScripts && bash backup-bdd-recette.sh"
BACKUP_SCRIPT_COMMAND_CHECK_STATUT_RECETTE="cd /home/malio/Malio-ops/RecetteScripts && bash check-statut-recette.sh"
```
The SSH connection remains necessary only for `vaultwarden`.
## 7. SSH to vaultwarden
The remote command used is:
```env
BACKUP_SCRIPT_COMMAND_BACKUP_VAULTWARDEN="ssh bitwarden \"bash -lc 'cd /home/matt/vaultwarden/Malio-ops/BackupVaultWarden && ./backup-vaultwarden.sh'\""
```
This implies:
- an SSH key available to the user running `Supervisor`
- the public key authorised on `vault.lpc-liot.fr`
- correct resolution of the `bitwarden` alias, or using `matt@vault.lpc-liot.fr` directly
Test example:
```bash
ssh matt@vault.lpc-liot.fr "hostname"
```
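The `bitwarden` alias can be pinned down in the SSH client configuration of the user running `Supervisor`. A sketch, assuming the host and user from this document; the key path is an assumption:

```
# ~/.ssh/config (user running Supervisor) — hypothetical entry
Host bitwarden
    HostName vault.lpc-liot.fr
    User matt
    IdentityFile ~/.ssh/id_ed25519
```

With this entry, `ssh bitwarden "hostname"` resolves the same way as the full `matt@vault.lpc-liot.fr` form.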
## 8. Disk commands
The storage charts depend on:
- `DISK_COMMAND_LOCAL`
- `DISK_COMMAND_REMOTE`
Working values:
```env
DISK_COMMAND_LOCAL="cd /home/malio/Malio-ops/CheckStorage && bash check-storage.sh"
DISK_COMMAND_REMOTE="ssh malio-b \"cd /home/malio-b/Malio-ops/CheckStorage && bash check-storage.sh\""
```
The local script also had an execute-permission problem and had to be made executable.
Example:
```bash
chmod +x /home/malio/Malio-ops/CheckStorage/check-storage.sh
```
## 9. Useful verification commands
Check the disk API response:
```bash
curl -s http://127.0.0.1:3000/api/disk -H "Authorization: Bearer <API_SECRET_KEY>"
```
Check the backup status:
```bash
curl -s http://127.0.0.1:3000/api/check-backup -H "Authorization: Bearer <API_SECRET_KEY>"
```
Check the PM2 process:
```bash
pm2 list
pm2 logs 0 --lines 100
```
Check the loaded Nginx configuration:
```bash
nginx -T
grep -R "supervisor.malio-dev.fr" /etc/nginx
```
## 10. Root causes of the main problems encountered
- `401` errors: `Secure` auth cookie while the site was served over HTTP
- `413` errors: missing `client_max_body_size` in the Nginx vhost
- `ssh undefined`: `BACKUPS_REMOTE_HOST` not picked up by the running process
- empty charts: `DISK_COMMAND_LOCAL` and `DISK_COMMAND_REMOTE` missing or badly loaded
- `.env` commands read incorrectly: wrong quoting of complex shell commands
- empty local storage: local script not executable
## 11. Security note
Secrets were displayed during debugging:
- `API_SECRET_KEY`
- `DISCORD_BOT_TOKEN`
They must be treated as compromised and regenerated.
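Regenerating `API_SECRET_KEY` can be done with any strong random source; a minimal sketch (the choice of `openssl rand` is an assumption, any equivalent generator works — the Discord bot token, by contrast, has to be reset from the Discord developer portal):

```shell
# Produce a 64-character hex string suitable as a replacement API_SECRET_KEY
openssl rand -hex 32
```

After updating the `.env`, restart the PM2 process so the new secret is actually loaded (see section 5).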