
I have a small Storybook project hosted on GitLab, and I use GitLab Pages to host the Storybook. The project builds into a folder called storybook-static, and the files from this folder are served through GitLab Pages.

I'm trying to set up the pipeline so I can have three subfolders: latest, next and test, as follows:

  1. When tag is created, the build files are moved to public/latest
  2. When MR is merged, the build files are moved to public/next
  3. When manually triggered, the build files are moved to public/test.

The three subdirectories should coexist, so updating public/latest does not affect public/test.

.gitlab-ci.yml:

image: node:16-alpine

stages:
  - build
  - deploy

cache:
  paths:
    - node_modules/

before_script:
  - npm ci --legacy-peer-deps

build-and-deploy-latest:
  stage: build
  script:
    - npm run build:storybook -c .storybook
    - rm -rf public/latest
    - mkdir -p public/latest
    - cp -r storybook-static/* public/latest
  artifacts:
    paths:
      - public
  rules:
    - if: '$CI_COMMIT_TAG'

build-and-deploy-next:
  stage: build
  script:
    - npm run build:storybook -c .storybook
    - rm -rf public/next
    - mkdir -p public/next
    - cp -r storybook-static/* public/next
  artifacts:
    paths:
      - public
  rules:
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event" && $CI_MERGE_REQUEST_EVENT_TYPE == "merge"'

build-and-deploy-test:
  stage: build
  script:
    - npm run build:storybook -c .storybook
    - rm -rf public/test
    - mkdir -p public/test
    - cp -r storybook-static/* public/test
  artifacts:
    paths:
      - public
  when: manual
  only:
    - merge_requests

The pipelines are triggered on the right events. I can see from the logs that the build is successful. When I browse the artifacts, the files exist (e.g. public/latest/index.html). When I download the artifacts folder and spin up a static server locally, it serves the index.html.

But opening /next, /latest or /test on the GitLab Pages site gives me a 404. I waited a day (in case it was caching, the runner needing time, etc.) and I still see 404s.

1 Answer

You can publish exactly one Pages site per project. Every time a job uploads the public artifact for the Pages site, it completely overwrites and removes any existing content on the project's Pages site. For example, if you run the Pages upload job on two different branches, only one result can exist on the Pages site at a time, which is probably why you're seeing this behavior.

So, only the contents of one of your jobs' artifacts will ever be reflected on the GitLab Pages site at any one time.

Also note that GitLab expects the upload job to be named pages for the Pages deployment to work.
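For reference, a minimal sketch of the shape GitLab expects: a single job literally named pages that publishes a public artifact (the build command here is taken from the question; the rest is illustrative):

```yaml
pages:
  stage: deploy
  script:
    - npm run build:storybook -c .storybook
    - mkdir -p public
    - cp -r storybook-static/* public/
  artifacts:
    paths:
      - public
```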


3 Comments

Thanks, I suspected something like this. I want to have three subdirectories (technically still one site). Is it possible to make a pipeline that copies the current content of the public dir (the whole public with /next, /latest and /test), then overrides just a single directory with the artifacts?
@ChristoPanayotov the only way would be a single job that does the Pages upload, and that single job will need to create (or otherwise obtain) the full and complete contents (/next, /latest AND /test), all in the same job, every time it runs.
Thank you, I'm going this direction.
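The single-job approach from the comments can be sketched as a small shell step: rebuild the full public/ directory on every run, carrying over the previously published content and replacing only one subfolder. The directory names old-public/ (where the previous site content has already been downloaded, e.g. from the last pages job's artifacts) and the TARGET values are assumptions for illustration, not part of the original answer:

```shell
#!/bin/sh
# Assemble the complete Pages content in one job.
# Assumes the previously published site is in old-public/ and the fresh
# Storybook build is in storybook-static/ (both names are hypothetical).
assemble_pages() {
  target="$1"               # one of: latest | next | test

  # start from a clean public/ every run
  rm -rf public
  mkdir -p public

  # carry over whatever was published before (if anything)
  [ -d old-public ] && cp -r old-public/. public/

  # replace only the chosen subfolder with the new build
  rm -rf "public/$target"
  mkdir -p "public/$target"
  cp -r storybook-static/. "public/$target"/
}
```

Because the whole public/ tree is uploaded as one artifact from one pages job, the other two subfolders survive each deployment.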
