Contributing
This document provides comprehensive guidance for developers working on the LearnCard ecosystem, covering local development setup, testing procedures, continuous integration/deployment pipelines, and production deployment strategies. It serves as a technical reference for contributors and maintainers of the LearnCard codebase.
To contribute to the LearnCard codebase, you'll need the following prerequisites:
Node.js: Version 20.10.0 (as specified in .nvmrc)
pnpm: Version 9 (package manager)
Git: For version control
The LearnCard repository is organized as a monorepo managed with NX. This structure allows for efficient management of multiple packages and services while sharing dependencies and build configurations.
Clone the repository:
Set up Node.js version:
Install dependencies:
Run tests for affected packages:
Build packages:
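Concretely, the steps above might look like the following. The repository URL is an assumption (substitute your fork), and `nvm`/`corepack` are one common way to satisfy the Node.js and pnpm prerequisites, not the only one:

```shell
# Clone the repository (URL assumed; use your fork if contributing)
git clone https://github.com/learningeconomy/LearnCard.git
cd LearnCard

# Use the Node.js version pinned in .nvmrc (20.10.0)
nvm use

# Activate pnpm 9 and install dependencies
corepack enable
corepack prepare pnpm@9 --activate
pnpm install

# Run tests only for packages affected by your changes
pnpm exec nx affected --target=test

# Build packages
pnpm exec nx run-many --target=build
```

Using `nx affected` keeps local test runs fast by diffing your branch against the base rather than testing the whole monorepo.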
LearnCard uses GitHub Actions for automated testing, deployment, and releases. The CI/CD pipeline handles testing for all pull requests and manages deployments to AWS when changes are merged to the main branch.
All pull requests trigger a test workflow that runs tests for affected packages. The workflow:
Checks out the repository
Sets up Node.js and pnpm
Installs dependencies
Runs tests with retries in case of flaky tests
Reports test results
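A minimal GitHub Actions sketch of that PR workflow might look like the following; the workflow name, retry approach, and step details are illustrative assumptions, not the repository's actual file:

```yaml
# Illustrative PR test workflow (names and retry strategy are assumptions)
name: Test PRs
on: pull_request

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0            # full history so NX can diff against the base branch
      - uses: pnpm/action-setup@v4
        with:
          version: 9
      - uses: actions/setup-node@v4
        with:
          node-version-file: .nvmrc
          cache: pnpm
      - run: pnpm install --frozen-lockfile
      # Re-run once on failure to absorb flaky tests
      - run: pnpm exec nx affected --target=test || pnpm exec nx affected --target=test
```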
When changes are merged to the main branch, the deploy workflow:
Runs tests to verify the changes
Determines which services are affected
Deploys updated services to AWS using the Serverless Framework
Environment variables are securely provided through GitHub Secrets
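The deploy job could be sketched as follows; the target names, base ref, and secret wiring are assumptions for illustration:

```yaml
# Illustrative deploy job (targets and secrets are assumptions)
deploy:
  runs-on: ubuntu-latest
  if: github.ref == 'refs/heads/main'
  steps:
    - uses: actions/checkout@v4
      with:
        fetch-depth: 0
    - run: pnpm install --frozen-lockfile
    # Deploy only services affected since the previous commit on main
    - run: pnpm exec nx affected --target=deploy --base=HEAD~1
      env:
        AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
        AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
```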
The deployment targets two main services:
Brain Service: "LearnCloud Network API"
LearnCloud Service: "LearnCloud Storage API"
The release workflow:
Runs after successful deployment
Builds all libraries
Uses Changesets to create a release PR or publish to npm
Triggers Docker image builds when a Changeset release PR is merged
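A release job built around the Changesets GitHub Action typically looks like this sketch (step details and token names are assumptions):

```yaml
# Illustrative release job using the Changesets action
release:
  runs-on: ubuntu-latest
  steps:
    - uses: actions/checkout@v4
    - run: pnpm install --frozen-lockfile
    # Build all libraries before publishing
    - run: pnpm exec nx run-many --target=build
    # Opens a release PR, or publishes to npm when that PR is merged
    - uses: changesets/action@v1
      with:
        publish: pnpm changeset publish
      env:
        GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
```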
LearnCard backend services are deployed as serverless applications on AWS, primarily using Lambda functions, API Gateway, and various supporting services.
The services are configured using the Serverless Framework, which manages the AWS resources. Key features:
Functions: Multiple Lambda functions serve different endpoints
VPC Configuration: Services run in a private subnet with NAT gateway access
ElastiCache: Redis cache for improved performance
Security Groups: Control network access between components
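These features map onto a `serverless.yml` roughly as follows; the service name, runtime, and resource IDs below are placeholders, not the actual configuration:

```yaml
# Illustrative serverless.yml fragment (names and IDs are placeholders)
service: brain-service

provider:
  name: aws
  runtime: nodejs20.x
  vpc:
    securityGroupIds:
      - sg-0123456789abcdef0       # placeholder: controls access to Redis/Neo4j
    subnetIds:
      - subnet-0123456789abcdef0   # placeholder: private subnet with NAT egress

functions:
  api:
    handler: src/index.handler
    events:
      - httpApi: '*'               # API Gateway routes all paths to the Lambda
```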
The deployment process uses numerous environment variables to configure the services securely. These include:

| Category | Variables | Purpose |
| --- | --- | --- |
| AWS Access | `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY` | AWS authentication |
| Database | `NEO4J_URI`, `NEO4J_USERNAME`, `NEO4J_PASSWORD` | Neo4j database access |
| Redis | `REDIS_HOST`, `REDIS_PORT` | Redis cache configuration |
| Cryptographic | `SEED`, `LEARN_CLOUD_SEED`, `JWT_SIGNING_KEY` | Secure key material |
| Monitoring | `SENTRY_DSN`, `DD_API_KEY` | Error tracking and monitoring |
LearnCard services are also published as Docker images to Docker Hub, making them easier to deploy in containerized environments.
The Docker release process:
Triggered when a Changeset release PR is merged to main
Extracts version information from package.json files
Builds Docker images for Brain Service and LearnCloud Service
Tags images with semantic version numbers
Pushes images to Docker Hub
Available images:
welibrary/lcn-brain-service: LearnCloud Network API container
welibrary/lcn-cloud-service: LearnCloud Storage API container
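Pulling and running one of these images might look like the sketch below; the exposed port and environment values are placeholder assumptions, not documented defaults:

```shell
# Pull the published images (tags follow the packages' semantic versions)
docker pull welibrary/lcn-brain-service:latest
docker pull welibrary/lcn-cloud-service:latest

# Run the Network API container; port and env values here are assumptions
docker run -d \
  -e NEO4J_URI="bolt://neo4j:7687" \
  -e NEO4J_USERNAME="neo4j" \
  -e NEO4J_PASSWORD="change-me" \
  -p 3000:3000 \
  welibrary/lcn-brain-service:latest
```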
The repository includes a "Maid Service" workflow that automatically cleans up the codebase when necessary. This workflow:
Runs after pushes to the main branch
Checks for unintended file changes that weren't committed
Creates an automated PR to clean up the worktree if necessary
This helps maintain a clean repository state, especially when automation scripts modify files.