
[FIX] Lychee failing action#352

Open
dansysanalyst wants to merge 5 commits into pestphp:4.x from dansysanalyst:fix_link_checker

Conversation

@dansysanalyst (Contributor)

Hi @owenvoke,

This PR updates lychee.toml according to the example proposed in the action's feedback: https://github.com/lycheeverse/lychee/blob/lychee-v0.23.0/lychee.example.toml

PS: Sorry for the goofy branch commit.


# Custom request headers
headers = []
header = { "accept" = "text/html", "x-custom-header" = "value" }
Contributor

I don't think we need a custom header here. 👍🏻

Suggested change
header = { "accept" = "text/html", "x-custom-header" = "value" }
header = { "accept" = "text/html" }

############################# Runtime #############################

# File to read and write cookies
cookie_jar = "cookie-jar"
Contributor

Let's comment out bits that aren't relevant for us.

Suggested change
cookie_jar = "cookie-jar"
# cookie_jar = "cookie-jar"

default_extension = "md"

# GitHub API token
github_token = "secret"
Contributor

Suggested change
github_token = "secret"
# github_token = "secret"

github_token = "secret"

# Resolve directories to index files
index_files = ["index.html"]
Contributor

Suggested change
index_files = ["index.html"]
# index_files = ["index.html"]

index_files = ["index.html"]

# Preprocess input files
preprocess = { command = "preprocess.sh" }
Contributor

Suggested change
preprocess = { command = "preprocess.sh" }
# preprocess = { command = "preprocess.sh" }


# Remap URI matching pattern to different URI.
remap = ["https://example.com http://example.invalid"]
Contributor

Suggested change
remap = ["https://example.com http://example.invalid"]
# remap = ["https://example.com http://example.invalid"]
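Taken together, the suggestions above would leave lychee.toml looking roughly like the following. This is a sketch assembled from the diff hunks in this review; any option not shown here is simply omitted, not endorsed:

```toml
# Custom request headers
headers = []
header = { "accept" = "text/html" }

############################# Runtime #############################

# File to read and write cookies
# cookie_jar = "cookie-jar"

default_extension = "md"

# GitHub API token
# github_token = "secret"

# Resolve directories to index files
# index_files = ["index.html"]

# Preprocess input files
# preprocess = { command = "preprocess.sh" }

# Remap URI matching pattern to different URI.
# remap = ["https://example.com http://example.invalid"]
```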

@dansysanalyst (Contributor, Author)

Hi @owenvoke,

Thanks for the feedback. I have simplified the GitHub Action by using inline arguments and removing the config file. I've also fixed a broken link so the check passes again.

Greetings and thanks

@owenvoke (Contributor) left a comment

LGTM. Just one more query.

lycheeVersion: latest
format: markdown
args: >
  --base https://pestphp.com
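For context, a minimal sketch of what the simplified workflow step could look like with inline arguments. The step name, file glob, and extra flags here are assumptions for illustration, not the exact contents of this PR:

```yaml
# Sketch only: assumes lycheeverse/lychee-action; the glob and
# extra flags are illustrative, not taken from this PR.
- name: Check links
  uses: lycheeverse/lychee-action@v2
  with:
    lycheeVersion: latest
    format: markdown
    args: >
      --base https://pestphp.com
      --no-progress
      './**/*.md'
    fail: true
```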
Contributor

Just want to check: this doesn't actually scan for broken links on the live site, it checks the local docs?

Contributor Author

I am not sure I understood the question, but the base argument is used to resolve the local files as links. For example, `[only()](/docs/filtering-tests#only)` resolves to https://pestphp.com/docs/filtering-tests#only.
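The resolution that `--base` performs on root-relative links can be sketched with Python's `urljoin`. This is an illustration of the behaviour, not lychee's actual implementation:

```python
# Sketch (illustrative only): how a --base URL turns root-relative
# doc links into absolute ones, as lychee's --base flag does.
from urllib.parse import urljoin

BASE = "https://pestphp.com"

def resolve(link: str, base: str = BASE) -> str:
    """Resolve a root-relative link against the site base URL."""
    return urljoin(base, link)

print(resolve("/docs/filtering-tests#only"))
# https://pestphp.com/docs/filtering-tests#only
```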

If we have a broken link on the home page, it will not be detected, as this action only checks the .md doc files.

If you see fit, we could probably set up ScholliYT/Broken-Links-Crawler-Action to scan the live website (likely in the pestphp/pestphp.com repo). But it has always seemed to me that the website does not change often enough for us to risk broken links outside the docs.

My original motivation with this action, some years ago, was to avoid broken links to external resources (e.g., the Laravel and PHPUnit docs) when features are renamed or moved. If I remember correctly, there were a couple of such cases during the 2.x docs revision.

@owenvoke (Contributor), Apr 1, 2026

Cool, that's all good then! 👍🏻 I just wanted to make sure it was checking local files, and not the actual remote.

This LGTM. However, I'm no longer a Pest team member, so my approval won't lead to a merge.
