mirror of
https://github.com/samsonjs/samhuri.net.git
synced 2026-04-25 14:37:47 +00:00
Split bake.rb into namespaced task files
Tasks are now namespaced by concern: build:*, draft:*, publish:*, quality:*. bake.rb is reduced to load path setup and the default task.
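Based on the diff below, the slimmed-down root `bake.rb` appears to reduce to roughly the following sketch (reconstructed from the hunks, not the verbatim file; the `quality:*` task file name is assumed from the commit message):

```ruby
# bake.rb (root): load path setup plus the default task. The task bodies now
# live in bake/build.rb, bake/draft.rb, bake/publish.rb, and (presumably)
# bake/quality.rb, invoked through bake's `call`.
LIB_PATH = File.expand_path("lib", __dir__).freeze
$LOAD_PATH.unshift(LIB_PATH) unless $LOAD_PATH.include?(LIB_PATH)

# Default task: run coverage and lint.
def default
  call("quality:coverage")
  call("quality:lint")
end
```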
This commit is contained in:
parent
30b7485709
commit
744b6b1204
7 changed files with 531 additions and 523 deletions
.github/workflows/ci.yml (vendored): 6 changes

@@ -24,7 +24,7 @@ jobs:
         run: bin/bootstrap

       - name: Coverage
-        run: bundle exec bake coverage
+        run: bundle exec bake quality:coverage

   lint:
     runs-on: ubuntu-latest
@@ -43,7 +43,7 @@ jobs:
         run: bin/bootstrap

       - name: Lint
-        run: bundle exec bake lint
+        run: bundle exec bake quality:lint

   debug:
     runs-on: ubuntu-latest
@@ -62,4 +62,4 @@ jobs:
         run: bin/bootstrap

       - name: Debug Build
-        run: bundle exec bake debug
+        run: bundle exec bake build:debug
AGENTS.md: 52 changes

@@ -4,7 +4,7 @@
 This repository is a Ruby static-site generator (Pressa) that outputs both HTML and Gemini formats.

 - Generator code: `lib/pressa/` (entrypoint: `lib/pressa.rb`)
-- Build/deploy/draft tasks: `bake.rb`
+- Build/publish/draft tasks: `bake.rb` (root) + `bake/` (namespaced task files)
 - Tests: `test/`
 - Site config: `site.toml`, `projects.toml`
 - Published posts: `posts/YYYY/MM/*.md`

@@ -20,26 +20,26 @@ Keep new code under the existing `Pressa` module structure (for example `lib/pre
 ## Setup, Build, Test, and Development Commands
 - Use `rbenv exec` for Ruby commands in this repository (for example `rbenv exec bundle exec ...`) to ensure the project Ruby version is used.
 - `bin/bootstrap`: install prerequisites and gems (uses `rbenv` when available).
-- `rbenv exec bundle exec bake debug`: build HTML for `http://localhost:8000` into `www/`.
-- `rbenv exec bundle exec bake serve`: serve `www/` via WEBrick on port 8000.
-- `rbenv exec bundle exec bake watch target=debug`: Linux-only autorebuild loop (`inotifywait` required).
-- `rbenv exec bundle exec bake mudge|beta|release`: build HTML with environment-specific base URLs.
-- `rbenv exec bundle exec bake gemini`: build Gemini capsule into `gemini/`.
-- `rbenv exec bundle exec bake publish_beta`: build and rsync `www/` to beta host.
-- `rbenv exec bundle exec bake publish_gemini`: build and rsync `gemini/` to production host.
-- `rbenv exec bundle exec bake publish`: build and rsync both HTML and Gemini to production.
-- `rbenv exec bundle exec bake clean`: remove `www/` and `gemini/`.
-- `rbenv exec bundle exec bake test`: run test suite.
-- `rbenv exec bundle exec bake guard`: run Guard for continuous testing.
-- `rbenv exec bundle exec bake lint`: lint code with StandardRB.
-- `rbenv exec bundle exec bake lint_fix`: auto-fix lint issues.
-- `rbenv exec bundle exec bake coverage`: run tests and report `lib/` line coverage.
-- `rbenv exec bundle exec bake coverage_regression baseline=merge-base`: compare coverage to a baseline and fail on regression (override `baseline` as needed).
+- `rbenv exec bundle exec bake build:debug`: build HTML for `http://localhost:8000` into `www/`.
+- `rbenv exec bundle exec bake build:serve`: serve `www/` via WEBrick on port 8000.
+- `rbenv exec bundle exec bake build:watch target=debug`: Linux-only autorebuild loop (`inotifywait` required).
+- `rbenv exec bundle exec bake build:mudge|build:beta|build:release`: build HTML with environment-specific base URLs.
+- `rbenv exec bundle exec bake build:gemini`: build Gemini capsule into `gemini/`.
+- `rbenv exec bundle exec bake publish:beta`: build and rsync `www/` to beta host.
+- `rbenv exec bundle exec bake publish:gemini`: build and rsync `gemini/` to production host.
+- `rbenv exec bundle exec bake publish:production`: build and rsync both HTML and Gemini to production.
+- `rbenv exec bundle exec bake build:clean`: remove `www/` and `gemini/`.
+- `rbenv exec bundle exec bake quality:test`: run test suite.
+- `rbenv exec bundle exec bake quality:guard`: run Guard for continuous testing.
+- `rbenv exec bundle exec bake quality:lint`: lint code with StandardRB.
+- `rbenv exec bundle exec bake quality:lint_fix`: auto-fix lint issues.
+- `rbenv exec bundle exec bake quality:coverage`: run tests and report `lib/` line coverage.
+- `rbenv exec bundle exec bake quality:coverage_regression baseline=merge-base`: compare coverage to a baseline and fail on regression (override `baseline` as needed).

 ## Draft Workflow
-- `rbenv exec bundle exec bake new_draft "Post Title"` creates `public/drafts/<slug>.md`.
-- `rbenv exec bundle exec bake drafts` lists available drafts.
-- `rbenv exec bundle exec bake publish_draft public/drafts/<slug>.md` moves draft to `posts/YYYY/MM/` and updates `Date` and `Timestamp`.
+- `rbenv exec bundle exec bake draft:new "Post Title"` creates `public/drafts/<slug>.md`.
+- `rbenv exec bundle exec bake draft:list` lists available drafts.
+- `rbenv exec bundle exec bake draft:publish public/drafts/<slug>.md` moves draft to `posts/YYYY/MM/` and updates `Date` and `Timestamp`.

 ## Content and Metadata Requirements
 Posts must include YAML front matter. Required keys (enforced by `Pressa::Posts::PostMetadata`) are:

@@ -53,7 +53,7 @@ Optional keys include `Tags`, `Link`, `Scripts`, and `Styles`.

 ## Coding Style & Naming Conventions
 - Ruby (see `.ruby-version`).
-- Follow idiomatic Ruby style and keep code `bake lint`-clean.
+- Follow idiomatic Ruby style and keep code `bake quality:lint`-clean.
 - Use 2-space indentation and descriptive `snake_case` names for methods/variables, `UpperCamelCase` for classes/modules.
 - Prefer small, focused classes for plugins, views, renderers, and config loaders.
 - Do not hand-edit generated files in `www/` or `gemini/`.

@@ -62,10 +62,10 @@ Optional keys include `Tags`, `Link`, `Scripts`, and `Styles`.
 - Use Minitest under `test/` (for example `test/posts`, `test/config`, `test/views`).
 - Add regression tests for parser, rendering, feed, and generator behavior changes.
 - Before submitting, run:
-  - `rbenv exec bundle exec bake test`
-  - `rbenv exec bundle exec bake coverage`
-  - `rbenv exec bundle exec bake lint`
-  - `rbenv exec bundle exec bake debug`
+  - `rbenv exec bundle exec bake quality:test`
+  - `rbenv exec bundle exec bake quality:coverage`
+  - `rbenv exec bundle exec bake quality:lint`
+  - `rbenv exec bundle exec bake build:debug`

 ## Commit & Pull Request Guidelines
 - Use concise, imperative commit subjects (history examples: `Fix internal permalink regression in archives`).

@@ -74,11 +74,11 @@ Optional keys include `Tags`, `Link`, `Scripts`, and `Styles`.
 - Include screenshots when changing rendered layout/CSS output.

 ## Deployment & Security Notes
-- Deployment is defined in `bake.rb` via rsync over SSH.
+- Publish tasks are defined in `bake/publish.rb` via rsync over SSH.
 - Current publish host is `mudge` with:
   - production HTML: `/var/www/samhuri.net/public`
   - beta HTML: `/var/www/beta.samhuri.net/public`
   - production Gemini: `/var/gemini/samhuri.net`
-- `bake publish` deploys both HTML and Gemini to production.
+- `bake publish:production` deploys both HTML and Gemini to production.
 - Validate `www/` and `gemini/` before publishing to avoid shipping stale assets.
 - Never commit credentials, SSH keys, or other secrets.
bake.rb: 496 changes

@@ -1,500 +1,8 @@
-# Build tasks for samhuri.net static site generator
-
-require "etc"
-require "fileutils"
-require "open3"
-require "tmpdir"
-
 LIB_PATH = File.expand_path("lib", __dir__).freeze
 $LOAD_PATH.unshift(LIB_PATH) unless $LOAD_PATH.include?(LIB_PATH)
-
-DRAFTS_DIR = "public/drafts".freeze
-PUBLISH_HOST = "mudge".freeze
-PRODUCTION_PUBLISH_DIR = "/var/www/samhuri.net/public".freeze
-BETA_PUBLISH_DIR = "/var/www/beta.samhuri.net/public".freeze
-GEMINI_PUBLISH_DIR = "/var/gemini/samhuri.net".freeze
-WATCHABLE_DIRECTORIES = %w[public posts lib].freeze
-LINT_TARGETS = %w[bake.rb Gemfile lib test].freeze
-BUILD_TARGETS = %w[debug mudge beta release gemini].freeze
-
-# Generate the site in debug mode (localhost:8000)
-def debug
-  build("http://localhost:8000", output_format: "html", target_path: "www")
-end
-
-# Generate the site for the mudge development server
-def mudge
-  build("http://mudge:8000", output_format: "html", target_path: "www")
-end
-
-# Generate the site for beta/staging
-def beta
-  build("https://beta.samhuri.net", output_format: "html", target_path: "www")
-end
-
-# Generate the site for production
-def release
-  build("https://samhuri.net", output_format: "html", target_path: "www")
-end
-
-# Generate the Gemini capsule for production
-def gemini
-  build("https://samhuri.net", output_format: "gemini", target_path: "gemini")
-end
-
-# Start local development server
-def serve
-  require "webrick"
-  server = WEBrick::HTTPServer.new(Port: 8000, DocumentRoot: "www")
-  trap("INT") { server.shutdown }
-  puts "Server running at http://localhost:8000"
-  server.start
-end
-
-# Create a new draft in public/drafts/.
-# @parameter title_parts [Array] Optional title words; defaults to Untitled.
-def new_draft(*title_parts)
-  title, filename =
-    if title_parts.empty?
-      ["Untitled", next_available_draft]
-    else
-      given_title = title_parts.join(" ")
-      slug = slugify(given_title)
-      abort "Error: title cannot be converted to a filename." if slug.empty?
-
-      filename = "#{slug}.md"
-      path = draft_path(filename)
-      abort "Error: draft already exists at #{path}" if File.exist?(path)
-
-      [given_title, filename]
-    end
-
-  FileUtils.mkdir_p(DRAFTS_DIR)
-  path = draft_path(filename)
-  content = render_draft_template(title)
-  File.write(path, content)
-
-  puts "Created new draft at #{path}"
-  puts ">>> Contents below <<<"
-  puts
-  puts content
-end
-
-# Publish a draft by moving it to posts/YYYY/MM and updating dates.
-# @parameter input_path [String] Draft path or filename in public/drafts.
-def publish_draft(input_path = nil)
-  if input_path.nil? || input_path.strip.empty?
-    puts "Usage: bake publish_draft <draft-path-or-filename>"
-    puts
-    puts "Available drafts:"
-    drafts = Dir.glob("#{DRAFTS_DIR}/*.md").map { |path| File.basename(path) }
-    if drafts.empty?
-      puts "  (no drafts found)"
-    else
-      drafts.each { |draft| puts "  #{draft}" }
-    end
-    abort
-  end
-
-  draft_path_value, draft_file = resolve_draft_input(input_path)
-  abort "Error: File not found: #{draft_path_value}" unless File.exist?(draft_path_value)
-
-  now = Time.now
-  content = File.read(draft_path_value)
-  content.sub!(/^Date:.*$/, "Date: #{ordinal_date(now)}")
-  content.sub!(/^Timestamp:.*$/, "Timestamp: #{now.strftime("%Y-%m-%dT%H:%M:%S%:z")}")
-
-  target_dir = "posts/#{now.strftime("%Y/%m")}"
-  FileUtils.mkdir_p(target_dir)
-  target_path = "#{target_dir}/#{draft_file}"
-
-  File.write(target_path, content)
-  FileUtils.rm_f(draft_path_value)
-
-  puts "Published draft: #{draft_path_value} -> #{target_path}"
-end
-
-# Watch content directories and rebuild on every change.
-# @parameter target [String] One of debug, mudge, beta, release, or gemini.
-def watch(target: "debug")
-  unless command_available?("inotifywait")
-    abort "inotifywait is required (install inotify-tools)."
-  end
-
-  loop do
-    abort "Error: watch failed." unless system("inotifywait", "-e", "modify,create,delete,move", *watch_paths)
-    puts "changed at #{Time.now}"
-    sleep 2
-    run_build_target(target)
-  end
-end
-
-# Publish to beta/staging server
-def publish_beta
-  beta
-  run_rsync(local_paths: ["www/"], publish_dir: BETA_PUBLISH_DIR, dry_run: false, delete: true)
-end
-
-# Publish Gemini capsule to production
-def publish_gemini
-  gemini
-  run_rsync(local_paths: ["gemini/"], publish_dir: GEMINI_PUBLISH_DIR, dry_run: false, delete: true)
-end
-
-# Publish to production server
-def publish
-  release
-  run_rsync(local_paths: ["www/"], publish_dir: PRODUCTION_PUBLISH_DIR, dry_run: false, delete: true)
-  publish_gemini
-end
-
-# Clean generated files
-def clean
-  FileUtils.rm_rf("www")
-  FileUtils.rm_rf("gemini")
-  puts "Cleaned www/ and gemini/ directories"
-end
-
-# Default task: run coverage and lint.
-def default
-  coverage
-  lint
-end
-
-# Run Minitest tests
-def test
-  run_test_suite(test_file_list)
-end
-
-# Run Guard for continuous testing
-def guard
-  exec "bundle exec guard"
-end
-
-# List all available drafts
-def drafts
-  Dir.glob("#{DRAFTS_DIR}/*.md").sort.each do |draft|
-    puts File.basename(draft)
-  end
-end
-
-# Run StandardRB linter
-def lint
-  run_standardrb
-end
-
-# Auto-fix StandardRB issues
-def lint_fix
-  run_standardrb("--fix")
-end
-
-# Measure line coverage for files under lib/.
-# @parameter lowest [Integer] Number of lowest-covered files to print (default: 10, use 0 to hide).
-def coverage(lowest: 10)
-  lowest_count = Integer(lowest)
-  abort "Error: lowest must be >= 0." if lowest_count.negative?
-
-  run_coverage(test_files: test_file_list, lowest_count:)
-end
-
-# Compare line coverage for files under lib/ against a baseline and fail on regression.
-# @parameter baseline [String] Baseline ref, or "merge-base" (default) to compare against merge-base with remote default branch.
-# @parameter lowest [Integer] Number of lowest-covered files to print for the current checkout (default: 10, use 0 to hide).
-def coverage_regression(baseline: "merge-base", lowest: 10)
-  lowest_count = Integer(lowest)
-  abort "Error: lowest must be >= 0." if lowest_count.negative?
-
-  baseline_ref = resolve_coverage_baseline_ref(baseline)
-  baseline_commit = capture_command("git", "rev-parse", "--short", baseline_ref).strip
-
-  puts "Running coverage for current checkout..."
-  current_output = capture_coverage_output(test_files: test_file_list, lowest_count:, chdir: Dir.pwd)
-  print current_output
-  current_percent = parse_coverage_percent(current_output)
-
-  puts "Running coverage for baseline #{baseline_ref} (#{baseline_commit})..."
-  baseline_percent = with_temporary_worktree(ref: baseline_ref) do |worktree_path|
-    baseline_tests = test_file_list(chdir: worktree_path)
-    baseline_output = capture_coverage_output(test_files: baseline_tests, lowest_count: 0, chdir: worktree_path)
-    parse_coverage_percent(baseline_output)
-  end
-
-  delta = current_percent - baseline_percent
-  puts format("Baseline coverage (%s %s): %.2f%%", baseline_ref, baseline_commit, baseline_percent)
-  puts format("Coverage delta: %+0.2f%%", delta)
-
-  return unless delta.negative?
-
-  abort format("Error: coverage regressed by %.2f%% against %s (%s).", -delta, baseline_ref, baseline_commit)
-end
-
-private
-
-def run_test_suite(test_files)
-  run_command("ruby", "-Ilib", "-Itest", "-e", "ARGV.each { |file| require File.expand_path(file) }", *test_files)
-end
-
-def run_coverage(test_files:, lowest_count:)
-  output = capture_coverage_output(test_files:, lowest_count:, chdir: Dir.pwd)
-  print output
-end
-
-def test_file_list(chdir: Dir.pwd)
-  test_files = Dir.chdir(chdir) { Dir.glob("test/**/*_test.rb").sort }
-  abort "Error: no tests found in test/**/*_test.rb under #{chdir}" if test_files.empty?
-
-  test_files
-end
-
-def coverage_script(lowest_count:)
-  <<~RUBY
-    require "coverage"
-
-    root = Dir.pwd
-    lib_root = File.join(root, "lib") + "/"
-    Coverage.start(lines: true)
-
-    at_exit do
-      result = Coverage.result
-      rows = result.keys
-        .select { |file| file.start_with?(lib_root) && file.end_with?(".rb") }
-        .sort
-        .map do |file|
-          lines = result[file][:lines] || []
-          total = 0
-          covered = 0
-          lines.each do |line_count|
-            next if line_count.nil?
-            total += 1
-            covered += 1 if line_count.positive?
-          end
-          percent = total.zero? ? 100.0 : (covered.to_f / total * 100)
-          [file, covered, total, percent]
-        end
-
-      covered_lines = rows.sum { |row| row[1] }
-      total_lines = rows.sum { |row| row[2] }
-      overall_percent = total_lines.zero? ? 100.0 : (covered_lines.to_f / total_lines * 100)
-      puts format("Coverage (lib): %.2f%% (%d / %d lines)", overall_percent, covered_lines, total_lines)
-
-      unless #{lowest_count}.zero? || rows.empty?
-        puts "Lowest covered files:"
-        rows.sort_by { |row| row[3] }.first(#{lowest_count}).each do |file, covered, total, percent|
-          relative_path = file.delete_prefix(root + "/")
-          puts format("  %6.2f%% %d/%d %s", percent, covered, total, relative_path)
-        end
-      end
-    end
-
-    ARGV.each { |file| require File.expand_path(file) }
-  RUBY
-end
-
-def capture_coverage_output(test_files:, lowest_count:, chdir:)
-  capture_command("ruby", "-Ilib", "-Itest", "-e", coverage_script(lowest_count:), *test_files, chdir:)
-end
-
-def parse_coverage_percent(output)
-  match = output.match(/Coverage \(lib\):\s+([0-9]+\.[0-9]+)%/)
-  abort "Error: unable to parse coverage output." unless match
-
-  Float(match[1])
-end
-
-def resolve_coverage_baseline_ref(baseline)
-  baseline_name = baseline.to_s.strip
-  abort "Error: baseline cannot be empty." if baseline_name.empty?
-
-  return coverage_merge_base_ref if baseline_name == "merge-base"
-
-  baseline_name
-end
-
-def coverage_merge_base_ref
-  remote = preferred_remote
-  remote_head_ref = remote_default_branch_ref(remote)
-  merge_base = capture_command("git", "merge-base", "HEAD", remote_head_ref).strip
-  abort "Error: could not resolve merge-base with #{remote_head_ref}." if merge_base.empty?

-  merge_base
-end
-
-def preferred_remote
-  upstream = capture_command_optional("git", "rev-parse", "--abbrev-ref", "--symbolic-full-name", "@{upstream}").strip
-  upstream_remote = upstream.split("/").first unless upstream.empty?
-  return upstream_remote if upstream_remote && !upstream_remote.empty?
-
-  remotes = capture_command("git", "remote").lines.map(&:strip).reject(&:empty?)
-  abort "Error: no git remotes configured; pass baseline=<ref>." if remotes.empty?
-
-  remotes.include?("origin") ? "origin" : remotes.first
-end
-
-def remote_default_branch_ref(remote)
-  symbolic = capture_command_optional("git", "symbolic-ref", "--quiet", "refs/remotes/#{remote}/HEAD").strip
-  if symbolic.empty?
-    fallback = "#{remote}/main"
-    capture_command("git", "rev-parse", "--verify", fallback)
-    return fallback
-  end
-
-  symbolic.sub("refs/remotes/", "")
-end
-
-def with_temporary_worktree(ref:)
-  temp_root = Dir.mktmpdir("coverage-baseline-")
-  worktree_path = File.join(temp_root, "worktree")
-
-  run_command("git", "worktree", "add", "--detach", worktree_path, ref)
-  begin
-    yield worktree_path
-  ensure
-    system("git", "worktree", "remove", "--force", worktree_path)
-    FileUtils.rm_rf(temp_root)
-  end
-end
-
-def capture_command(*command, chdir: Dir.pwd)
-  stdout, stderr, status = Dir.chdir(chdir) { Open3.capture3(*command) }
-  output = +""
-  output << stdout unless stdout.empty?
-  output << stderr unless stderr.empty?
-  abort "Error: command failed: #{command.join(" ")}\n#{output}" unless status.success?
-
-  output
-end
-
-def capture_command_optional(*command, chdir: Dir.pwd)
-  stdout, stderr, status = Dir.chdir(chdir) { Open3.capture3(*command) }
-  return stdout if status.success?
-  return "" if stderr.include?("no upstream configured") || stderr.include?("is not a symbolic ref")
-
-  ""
-end
-
-# Build the site with specified URL and output format.
-# @parameter url [String] The site URL to use.
-# @parameter output_format [String] One of html or gemini.
-# @parameter target_path [String] Target directory for generated output.
-def build(url, output_format:, target_path:)
-  require "pressa"
-
-  puts "Building #{output_format} site for #{url}..."
-  site = Pressa.create_site(source_path: ".", url_override: url, output_format:)
-  generator = Pressa::SiteGenerator.new(site:)
-  generator.generate(source_path: ".", target_path:)
-  puts "Site built successfully in #{target_path}/"
-end
-
-def run_build_target(target)
-  target_name = target.to_s
-  unless BUILD_TARGETS.include?(target_name)
-    abort "Error: invalid target '#{target_name}'. Use one of: #{BUILD_TARGETS.join(", ")}"
-  end
-
-  public_send(target_name)
-end
-
-def watch_paths
-  WATCHABLE_DIRECTORIES.flat_map { |path| ["-r", path] }
-end
-
-def standardrb_command(*extra_args)
-  ["bundle", "exec", "standardrb", *extra_args, *LINT_TARGETS]
-end
-
-def run_standardrb(*extra_args)
-  run_command(*standardrb_command(*extra_args))
-end
-
-def run_command(*command)
-  abort "Error: command failed: #{command.join(" ")}" unless system(*command)
-end
-
-def run_rsync(local_paths:, publish_dir:, dry_run:, delete:)
-  command = ["rsync", "-aKv", "-e", "ssh -4"]
-  command << "--dry-run" if dry_run
-  command << "--delete" if delete
-  command.concat(local_paths)
-  command << "#{PUBLISH_HOST}:#{publish_dir}"
-  abort "Error: rsync failed." unless system(*command)
-end
-
-def resolve_draft_input(input_path)
-  if input_path.include?("/")
-    if input_path.start_with?("posts/")
-      abort "Error: '#{input_path}' is already published in posts/ directory"
-    end
-
-    [input_path, File.basename(input_path)]
-  else
-    [draft_path(input_path), input_path]
-  end
-end
-
-def draft_path(filename)
-  File.join(DRAFTS_DIR, filename)
-end
-
-def slugify(title)
-  title.downcase
-    .gsub(/[^a-z0-9\s-]/, "")
-    .gsub(/\s+/, "-").squeeze("-")
-    .gsub(/^-|-$/, "")
-end
-
-def next_available_draft(base_filename = "untitled.md")
-  return base_filename unless File.exist?(draft_path(base_filename))
-
-  name_without_ext = File.basename(base_filename, ".md")
-  counter = 1
-  loop do
-    numbered_filename = "#{name_without_ext}-#{counter}.md"
-    return numbered_filename unless File.exist?(draft_path(numbered_filename))
-
-    counter += 1
-  end
-end
-
-def render_draft_template(title)
-  now = Time.now
-  <<~FRONTMATTER
-    ---
-    Author: #{current_author}
-    Title: #{title}
-    Date: unpublished
-    Timestamp: #{now.strftime("%Y-%m-%dT%H:%M:%S%:z")}
-    Tags:
-    ---
-
-    # #{title}
-
-    TKTK
-  FRONTMATTER
-end
-
-def current_author
-  Etc.getlogin || ENV["USER"] || `whoami`.strip
-rescue
-  ENV["USER"] || `whoami`.strip
-end
-
-def ordinal_date(time)
-  day = time.day
-  suffix = case day
-  when 1, 21, 31
-    "st"
-  when 2, 22
-    "nd"
-  when 3, 23
-    "rd"
-  else
-    "th"
-  end
-
-  time.strftime("#{day}#{suffix} %B, %Y")
-end
-def command_available?(command)
-  system("which", command, out: File::NULL, err: File::NULL)
-end
+
+# Default task: run coverage and lint.
+def default
+  call("quality:coverage")
+  call("quality:lint")
+end
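The rsync invocation that this commit moves into `bake/publish.rb` is easy to sanity-check in isolation. The `rsync_command` helper below is hypothetical: it is just the array-building half of `run_rsync`, returned instead of executed, so the final command line can be inspected without touching the network:

```ruby
# Hypothetical helper extracted from run_rsync: builds the rsync argv
# (archive mode, keep symlink dirs, verbose, IPv4 SSH transport) without
# running it, so the flags can be inspected.
def rsync_command(local_paths:, publish_dir:, dry_run:, delete:, host: "mudge")
  command = ["rsync", "-aKv", "-e", "ssh -4"]
  command << "--dry-run" if dry_run
  command << "--delete" if delete
  command.concat(local_paths)
  command << "#{host}:#{publish_dir}"
  command
end

p rsync_command(local_paths: ["www/"],
  publish_dir: "/var/www/beta.samhuri.net/public",
  dry_run: true, delete: true)
# => ["rsync", "-aKv", "-e", "ssh -4", "--dry-run", "--delete", "www/", "mudge:/var/www/beta.samhuri.net/public"]
```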
bake/build.rb (new file): 93 additions

@@ -0,0 +1,93 @@
+require "fileutils"
+
+WATCHABLE_DIRECTORIES = %w[public posts lib].freeze
+BUILD_TARGETS = %w[debug mudge beta release gemini].freeze
+
+# Generate the site in debug mode (localhost:8000)
+def debug
+  build("http://localhost:8000", output_format: "html", target_path: "www")
+end
+
+# Generate the site for the mudge development server
+def mudge
+  build("http://mudge:8000", output_format: "html", target_path: "www")
+end
+
+# Generate the site for beta/staging
+def beta
+  build("https://beta.samhuri.net", output_format: "html", target_path: "www")
+end
+
+# Generate the site for production
+def release
+  build("https://samhuri.net", output_format: "html", target_path: "www")
+end
+
+# Generate the Gemini capsule for production
+def gemini
+  build("https://samhuri.net", output_format: "gemini", target_path: "gemini")
+end
+
+# Start local development server
+def serve
+  require "webrick"
+  server = WEBrick::HTTPServer.new(Port: 8000, DocumentRoot: "www")
+  trap("INT") { server.shutdown }
+  puts "Server running at http://localhost:8000"
+  server.start
+end
+
+# Clean generated files
+def clean
+  FileUtils.rm_rf("www")
+  FileUtils.rm_rf("gemini")
+  puts "Cleaned www/ and gemini/ directories"
+end
+
+# Watch content directories and rebuild on every change.
+# @parameter target [String] One of debug, mudge, beta, release, or gemini.
+def watch(target: "debug")
+  unless command_available?("inotifywait")
+    abort "inotifywait is required (install inotify-tools)."
+  end
+
+  loop do
+    abort "Error: watch failed." unless system("inotifywait", "-e", "modify,create,delete,move", *watch_paths)
+    puts "changed at #{Time.now}"
+    sleep 2
+    run_build_target(target)
+  end
+end
+
+private
+
+# Build the site with specified URL and output format.
+# @parameter url [String] The site URL to use.
+# @parameter output_format [String] One of html or gemini.
+# @parameter target_path [String] Target directory for generated output.
+def build(url, output_format:, target_path:)
+  require "pressa"
+
+  puts "Building #{output_format} site for #{url}..."
+  site = Pressa.create_site(source_path: ".", url_override: url, output_format:)
+  generator = Pressa::SiteGenerator.new(site:)
+  generator.generate(source_path: ".", target_path:)
+  puts "Site built successfully in #{target_path}/"
+end
+
+def run_build_target(target)
+  target_name = target.to_s
+  unless BUILD_TARGETS.include?(target_name)
+    abort "Error: invalid target '#{target_name}'. Use one of: #{BUILD_TARGETS.join(", ")}"
+  end
+
+  public_send(target_name)
+end
+
+def watch_paths
+  WATCHABLE_DIRECTORIES.flat_map { |path| ["-r", path] }
+end
+
+def command_available?(command)
+  system("which", command, out: File::NULL, err: File::NULL)
+end
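`watch_paths` expands each watched directory into a recursive `-r <dir>` argument pair for `inotifywait`; a standalone check of that expansion:

```ruby
WATCHABLE_DIRECTORIES = %w[public posts lib].freeze

# Each watched directory becomes a recursive-watch argument pair,
# flattened into one argv for inotifywait.
def watch_paths
  WATCHABLE_DIRECTORIES.flat_map { |path| ["-r", path] }
end

p watch_paths
# => ["-r", "public", "-r", "posts", "-r", "lib"]
```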
152
bake/draft.rb
Normal file
152
bake/draft.rb
Normal file
|
|
@ -0,0 +1,152 @@
|
|||
require "etc"
|
||||
require "fileutils"
|
||||
|
||||
DRAFTS_DIR = "public/drafts".freeze
|
||||
|
||||
# Create a new draft in public/drafts/.
|
||||
# @parameter title_parts [Array] Optional title words; defaults to Untitled.
|
||||
def new(*title_parts)
|
||||
title, filename =
|
||||
if title_parts.empty?
|
||||
["Untitled", next_available_draft]
|
||||
else
|
||||
given_title = title_parts.join(" ")
|
||||
slug = slugify(given_title)
|
||||
abort "Error: title cannot be converted to a filename." if slug.empty?
|
||||
|
||||
filename = "#{slug}.md"
|
||||
path = draft_path(filename)
|
||||
abort "Error: draft already exists at #{path}" if File.exist?(path)
|
||||
|
||||
[given_title, filename]
|
||||
end
|
||||
|
||||
FileUtils.mkdir_p(DRAFTS_DIR)
|
||||
path = draft_path(filename)
|
||||
content = render_draft_template(title)
|
||||
File.write(path, content)
|
||||
|
||||
puts "Created new draft at #{path}"
|
||||
puts ">>> Contents below <<<"
|
||||
puts
|
||||
puts content
|
||||
end
|
||||
|
||||
# Publish a draft by moving it to posts/YYYY/MM and updating dates.
|
||||
# @parameter input_path [String] Draft path or filename in public/drafts.
def publish(input_path = nil)
  if input_path.nil? || input_path.strip.empty?
    puts "Usage: bake draft:publish <draft-path-or-filename>"
    puts
    puts "Available drafts:"
    drafts = Dir.glob("#{DRAFTS_DIR}/*.md").map { |path| File.basename(path) }
    if drafts.empty?
      puts " (no drafts found)"
    else
      drafts.each { |draft| puts " #{draft}" }
    end
    abort
  end

  draft_path_value, draft_file = resolve_draft_input(input_path)
  abort "Error: File not found: #{draft_path_value}" unless File.exist?(draft_path_value)

  now = Time.now
  content = File.read(draft_path_value)
  content.sub!(/^Date:.*$/, "Date: #{ordinal_date(now)}")
  content.sub!(/^Timestamp:.*$/, "Timestamp: #{now.strftime("%Y-%m-%dT%H:%M:%S%:z")}")

  target_dir = "posts/#{now.strftime("%Y/%m")}"
  FileUtils.mkdir_p(target_dir)
  target_path = "#{target_dir}/#{draft_file}"

  File.write(target_path, content)
  FileUtils.rm_f(draft_path_value)

  puts "Published draft: #{draft_path_value} -> #{target_path}"
end

# List all available drafts
def list
  Dir.glob("#{DRAFTS_DIR}/*.md").sort.each do |draft|
    puts File.basename(draft)
  end
  nil
end

private

def resolve_draft_input(input_path)
  if input_path.include?("/")
    if input_path.start_with?("posts/")
      abort "Error: '#{input_path}' is already published in posts/ directory"
    end

    [input_path, File.basename(input_path)]
  else
    [draft_path(input_path), input_path]
  end
end

def draft_path(filename)
  File.join(DRAFTS_DIR, filename)
end

def slugify(title)
  title.downcase
    .gsub(/[^a-z0-9\s-]/, "")
    .gsub(/\s+/, "-").squeeze("-")
    .gsub(/^-|-$/, "")
end

def next_available_draft(base_filename = "untitled.md")
  return base_filename unless File.exist?(draft_path(base_filename))

  name_without_ext = File.basename(base_filename, ".md")
  counter = 1
  loop do
    numbered_filename = "#{name_without_ext}-#{counter}.md"
    return numbered_filename unless File.exist?(draft_path(numbered_filename))

    counter += 1
  end
end

def render_draft_template(title)
  now = Time.now
  <<~FRONTMATTER
    ---
    Author: #{current_author}
    Title: #{title}
    Date: unpublished
    Timestamp: #{now.strftime("%Y-%m-%dT%H:%M:%S%:z")}
    Tags:
    ---

    # #{title}

    TKTK
  FRONTMATTER
end

def current_author
  Etc.getlogin || ENV["USER"] || `whoami`.strip
rescue
  ENV["USER"] || `whoami`.strip
end

def ordinal_date(time)
  day = time.day
  suffix = case day
  when 1, 21, 31
    "st"
  when 2, 22
    "nd"
  when 3, 23
    "rd"
  else
    "th"
  end

  time.strftime("#{day}#{suffix} %B, %Y")
end
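The `slugify` and `ordinal_date` helpers are easy to sanity-check outside the bake context. A minimal standalone sketch — the method bodies mirror the ones above, and the sample inputs are made up for illustration:

```ruby
# Standalone copies of the draft helpers, for illustration only.
def slugify(title)
  title.downcase
    .gsub(/[^a-z0-9\s-]/, "")
    .gsub(/\s+/, "-").squeeze("-")
    .gsub(/^-|-$/, "")
end

def ordinal_date(time)
  day = time.day
  suffix = case day
  when 1, 21, 31 then "st"
  when 2, 22 then "nd"
  when 3, 23 then "rd"
  else "th"
  end
  time.strftime("#{day}#{suffix} %B, %Y")
end

puts slugify("Hello, World! Again")     # hello-world-again
puts ordinal_date(Time.new(2026, 3, 3)) # 3rd March, 2026
```

Note that days 11–13 fall through to the `else` branch, which happens to be correct ("11th", not "11st").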
34 bake/publish.rb Normal file

@@ -0,0 +1,34 @@
PUBLISH_HOST = "mudge".freeze
PRODUCTION_PUBLISH_DIR = "/var/www/samhuri.net/public".freeze
BETA_PUBLISH_DIR = "/var/www/beta.samhuri.net/public".freeze
GEMINI_PUBLISH_DIR = "/var/gemini/samhuri.net".freeze

# Publish to beta/staging server
def beta
  call("build:beta")
  run_rsync(local_paths: ["www/"], publish_dir: BETA_PUBLISH_DIR, dry_run: false, delete: true)
end

# Publish Gemini capsule to production
def gemini
  call("build:gemini")
  run_rsync(local_paths: ["gemini/"], publish_dir: GEMINI_PUBLISH_DIR, dry_run: false, delete: true)
end

# Publish to production server
def production
  call("build:release")
  run_rsync(local_paths: ["www/"], publish_dir: PRODUCTION_PUBLISH_DIR, dry_run: false, delete: true)
  call("publish:gemini")
end

private

def run_rsync(local_paths:, publish_dir:, dry_run:, delete:)
  command = ["rsync", "-aKv", "-e", "ssh -4"]
  command << "--dry-run" if dry_run
  command << "--delete" if delete
  command.concat(local_paths)
  command << "#{PUBLISH_HOST}:#{publish_dir}"
  abort "Error: rsync failed." unless system(*command)
end
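`run_rsync` builds an argv array and hands it to `system` directly, so no shell quoting is involved. A sketch that assembles the same command without executing it (constants copied from above, method renamed to make clear it only builds the argv):

```ruby
PUBLISH_HOST = "mudge"
BETA_PUBLISH_DIR = "/var/www/beta.samhuri.net/public"

# Build the rsync argv the way run_rsync does, without running it.
def rsync_argv(local_paths:, publish_dir:, dry_run:, delete:)
  command = ["rsync", "-aKv", "-e", "ssh -4"]
  command << "--dry-run" if dry_run
  command << "--delete" if delete
  command.concat(local_paths)
  command << "#{PUBLISH_HOST}:#{publish_dir}"
end

p rsync_argv(local_paths: ["www/"], publish_dir: BETA_PUBLISH_DIR, dry_run: true, delete: true)
# ["rsync", "-aKv", "-e", "ssh -4", "--dry-run", "--delete", "www/", "mudge:/var/www/beta.samhuri.net/public"]
```

Because `ssh -4` is passed as a single argv element after `-e`, rsync receives it intact as the remote-shell command.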
221 bake/quality.rb Normal file

@@ -0,0 +1,221 @@
require "fileutils"
require "open3"
require "tmpdir"

LINT_TARGETS = %w[bake.rb Gemfile bake lib test].freeze

# Run Minitest tests
def test
  run_test_suite(test_file_list)
end

# Run Guard for continuous testing
def guard
  exec "bundle exec guard"
end

# Run StandardRB linter
def lint
  run_standardrb
end

# Auto-fix StandardRB issues
def lint_fix
  run_standardrb("--fix")
end

# Measure line coverage for files under lib/.
# @parameter lowest [Integer] Number of lowest-covered files to print (default: 10, use 0 to hide).
def coverage(lowest: 10)
  lowest_count = Integer(lowest)
  abort "Error: lowest must be >= 0." if lowest_count.negative?

  run_coverage(test_files: test_file_list, lowest_count:)
end

# Compare line coverage for files under lib/ against a baseline and fail on regression.
# @parameter baseline [String] Baseline ref, or "merge-base" (default) to compare against the merge-base with the remote default branch.
# @parameter lowest [Integer] Number of lowest-covered files to print for the current checkout (default: 10, use 0 to hide).
def coverage_regression(baseline: "merge-base", lowest: 10)
  lowest_count = Integer(lowest)
  abort "Error: lowest must be >= 0." if lowest_count.negative?

  baseline_ref = resolve_coverage_baseline_ref(baseline)
  baseline_commit = capture_command("git", "rev-parse", "--short", baseline_ref).strip

  puts "Running coverage for current checkout..."
  current_output = capture_coverage_output(test_files: test_file_list, lowest_count:, chdir: Dir.pwd)
  print current_output
  current_percent = parse_coverage_percent(current_output)

  puts "Running coverage for baseline #{baseline_ref} (#{baseline_commit})..."
  baseline_percent = with_temporary_worktree(ref: baseline_ref) do |worktree_path|
    baseline_tests = test_file_list(chdir: worktree_path)
    baseline_output = capture_coverage_output(test_files: baseline_tests, lowest_count: 0, chdir: worktree_path)
    parse_coverage_percent(baseline_output)
  end

  delta = current_percent - baseline_percent
  puts format("Baseline coverage (%s %s): %.2f%%", baseline_ref, baseline_commit, baseline_percent)
  puts format("Coverage delta: %+0.2f%%", delta)

  return unless delta.negative?

  abort format("Error: coverage regressed by %.2f%% against %s (%s).", -delta, baseline_ref, baseline_commit)
end

private

def run_test_suite(test_files)
  run_command("ruby", "-Ilib", "-Itest", "-e", "ARGV.each { |file| require File.expand_path(file) }", *test_files)
end

def run_coverage(test_files:, lowest_count:)
  output = capture_coverage_output(test_files:, lowest_count:, chdir: Dir.pwd)
  print output
end

def test_file_list(chdir: Dir.pwd)
  test_files = Dir.chdir(chdir) { Dir.glob("test/**/*_test.rb").sort }
  abort "Error: no tests found in test/**/*_test.rb under #{chdir}" if test_files.empty?

  test_files
end

def coverage_script(lowest_count:)
  <<~RUBY
    require "coverage"

    root = Dir.pwd
    lib_root = File.join(root, "lib") + "/"
    Coverage.start(lines: true)

    at_exit do
      result = Coverage.result
      rows = result.keys
        .select { |file| file.start_with?(lib_root) && file.end_with?(".rb") }
        .sort
        .map do |file|
          lines = result[file][:lines] || []
          total = 0
          covered = 0
          lines.each do |line_count|
            next if line_count.nil?
            total += 1
            covered += 1 if line_count.positive?
          end
          percent = total.zero? ? 100.0 : (covered.to_f / total * 100)
          [file, covered, total, percent]
        end

      covered_lines = rows.sum { |row| row[1] }
      total_lines = rows.sum { |row| row[2] }
      overall_percent = total_lines.zero? ? 100.0 : (covered_lines.to_f / total_lines * 100)
      puts format("Coverage (lib): %.2f%% (%d / %d lines)", overall_percent, covered_lines, total_lines)

      unless #{lowest_count}.zero? || rows.empty?
        puts "Lowest covered files:"
        rows.sort_by { |row| row[3] }.first(#{lowest_count}).each do |file, covered, total, percent|
          relative_path = file.delete_prefix(root + "/")
          puts format(" %6.2f%% %d/%d %s", percent, covered, total, relative_path)
        end
      end
    end

    ARGV.each { |file| require File.expand_path(file) }
  RUBY
end

def capture_coverage_output(test_files:, lowest_count:, chdir:)
  capture_command("ruby", "-Ilib", "-Itest", "-e", coverage_script(lowest_count:), *test_files, chdir:)
end

def parse_coverage_percent(output)
  match = output.match(/Coverage \(lib\):\s+([0-9]+\.[0-9]+)%/)
  abort "Error: unable to parse coverage output." unless match

  Float(match[1])
end

def resolve_coverage_baseline_ref(baseline)
  baseline_name = baseline.to_s.strip
  abort "Error: baseline cannot be empty." if baseline_name.empty?

  return coverage_merge_base_ref if baseline_name == "merge-base"

  baseline_name
end

def coverage_merge_base_ref
  remote = preferred_remote
  remote_head_ref = remote_default_branch_ref(remote)
  merge_base = capture_command("git", "merge-base", "HEAD", remote_head_ref).strip
  abort "Error: could not resolve merge-base with #{remote_head_ref}." if merge_base.empty?

  merge_base
end

def preferred_remote
  upstream = capture_command_optional("git", "rev-parse", "--abbrev-ref", "--symbolic-full-name", "@{upstream}").strip
  upstream_remote = upstream.split("/").first unless upstream.empty?
  return upstream_remote if upstream_remote && !upstream_remote.empty?

  remotes = capture_command("git", "remote").lines.map(&:strip).reject(&:empty?)
  abort "Error: no git remotes configured; pass baseline=<ref>." if remotes.empty?

  remotes.include?("origin") ? "origin" : remotes.first
end

def remote_default_branch_ref(remote)
  symbolic = capture_command_optional("git", "symbolic-ref", "--quiet", "refs/remotes/#{remote}/HEAD").strip
  if symbolic.empty?
    fallback = "#{remote}/main"
    capture_command("git", "rev-parse", "--verify", fallback)
    return fallback
  end

  symbolic.sub("refs/remotes/", "")
end

def with_temporary_worktree(ref:)
  temp_root = Dir.mktmpdir("coverage-baseline-")
  worktree_path = File.join(temp_root, "worktree")

  run_command("git", "worktree", "add", "--detach", worktree_path, ref)
  begin
    yield worktree_path
  ensure
    system("git", "worktree", "remove", "--force", worktree_path)
    FileUtils.rm_rf(temp_root)
  end
end

def capture_command(*command, chdir: Dir.pwd)
  stdout, stderr, status = Dir.chdir(chdir) { Open3.capture3(*command) }
  output = +""
  output << stdout unless stdout.empty?
  output << stderr unless stderr.empty?
  abort "Error: command failed: #{command.join(" ")}\n#{output}" unless status.success?

  output
end

def capture_command_optional(*command, chdir: Dir.pwd)
  stdout, stderr, status = Dir.chdir(chdir) { Open3.capture3(*command) }
  return stdout if status.success?
  return "" if stderr.include?("no upstream configured") || stderr.include?("is not a symbolic ref")

  ""
end

def standardrb_command(*extra_args)
  ["bundle", "exec", "standardrb", *extra_args, *LINT_TARGETS]
end

def run_standardrb(*extra_args)
  run_command(*standardrb_command(*extra_args))
end

def run_command(*command)
  abort "Error: command failed: #{command.join(" ")}" unless system(*command)
end
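The regression check hinges on `parse_coverage_percent` recognizing the summary line the coverage script prints. A standalone sketch with a fabricated summary line (the method mirrors the one above, raising instead of aborting so it can run outside a bake task):

```ruby
# Mirrors parse_coverage_percent above, for illustration only.
def parse_coverage_percent(output)
  match = output.match(/Coverage \(lib\):\s+([0-9]+\.[0-9]+)%/)
  raise "unable to parse coverage output" unless match

  Float(match[1])
end

sample = "Coverage (lib): 87.50% (700 / 800 lines)\n"
puts parse_coverage_percent(sample) # 87.5
```

Because the regex anchors on the literal `Coverage (lib):` prefix, test output and file-level rows printed around the summary line cannot be mistaken for it.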