
I have a project in Azure Repos that uses Git LFS, with several binary image files checked in via LFS. When my Azure Pipelines build performs a git pull, the image files are not pulled from Git LFS, and I am left with several zero-byte image files.

I'm using a custom self-hosted Azure Pipelines build server which has a recent version of Git LFS installed on it:

PS C:\rehan> git lfs --version
git-lfs/2.7.2 (GitHub; windows amd64; go 1.12.2; git 08a08ae0)

I've tried adding steps to perform a git lfs install but that doesn't help. When I manually perform a git lfs pull after logging on to the build server, the files are downloaded correctly. When I run git lfs pull as a build step in my Azure Pipeline, I get the following error:

fatal: could not read Username for 'https://foo.visualstudio.com': terminal prompts disabled
batch response: Git credentials for https://foo.visualstudio.com/Bar/_git/Bar not found.
error: failed to fetch some objects from 'https://foo.visualstudio.com/Bar/_git/Bar.git/info/lfs'
Downloading LFS objects:   0% (0/1), 0 B | 0 B/s                                
##[error]PowerShell exited with code '1'.

5 Answers


You have to use HTTPS for LFS to work with Azure DevOps, and you have to do an LFS checkout when doing the builds:

steps:
- checkout: self  # self represents the repo where the initial Pipelines YAML file was found
  lfs: true

If you are using the classic UI editor, there is a checkbox to check out LFS files.

https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azure-devops&tabs=schema#checkout
https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/pipeline-options-for-git?view=azure-devops#checkout-files-from-lfs


4 Comments

This checkout step worked for me but HTTPS was not necessary. I am using Azure DevOps 2019 where the server and build agent are on the same Windows domain.
I am experiencing the same issue as the OP and have tried setting lfs: true on the checkout step, but I am still (albeit intermittently) missing the LFS files; they are just text files containing an SHA. I wonder if there is a way to put a Git trace on the build machine to see what the Git LFS hook is doing.
You could spin up a VM, onboard it as a self-hosted agent, and do tracing there. Alternatively, there are some environment variables that allow for debug output. Have you tried system.debug and/or "Enable system diagnostics"?
The agent parses this field as a string and does not verify that it was actually set as one. When it is cast to string after parsing, the boolean becomes null, so the field is ignored. This might be the reason it's not working for some people. Or it stopped working at some point with an update to YamlDotNet; sadly, the agent repo does not say which version it uses, so there is also no info on updates.
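To follow up on the tracing suggestion above, here is one way it could look as a pipeline step (a sketch: GIT_TRACE and GIT_CURL_VERBOSE are standard Git debug environment variables, and system.debug is Azure Pipelines' built-in diagnostic switch; the step layout is illustrative):

```yaml
# Sketch: run the LFS pull with Git tracing enabled so transfer and
# credential-helper activity shows up in the build log.
variables:
  system.debug: true   # verbose Azure Pipelines task logging

steps:
- checkout: self
  lfs: true

- script: git lfs pull
  displayName: 'git lfs pull (traced)'
  env:
    GIT_TRACE: 1         # low-level git / git-lfs tracing
    GIT_CURL_VERBOSE: 1  # HTTP request/response detail
```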

Unfortunately, @4c74356b41's answer did not work for me. I have the same issue as @Haroon:

I configured the checkout step with lfs: true, but the files remained text files containing hashes.

The solution was to manually run git lfs fetch and git lfs pull:

  steps:
  - checkout: self
    lfs: true

  - script: |
     git lfs fetch
     git lfs pull
    displayName: git-lfs

1 Comment

You can also just delete the broken source that has been checked out (including the .git folder) and then re-run the pipeline (after adding lfs: true)

I'm adding this answer to promote Naxin's comment from May 2023.

The solution that worked for me was the succinct,

You can also just delete the broken source that has been checked out (including the .git folder) and then re-run the pipeline (after adding lfs: true)
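On a self-hosted agent you can get the same effect declaratively instead of deleting the folder by hand. A sketch (workspace: clean: all is standard Azure Pipelines YAML for wiping the agent workspace before the run; combined here with the lfs: true fix):

```yaml
# Sketch: wipe the stale checkout (including .git) before checking out
# again with LFS enabled, so no zero-byte pointer files survive.
jobs:
- job: build
  workspace:
    clean: all   # delete the previous working directory before this run
  steps:
  - checkout: self
    lfs: true    # fetch LFS content instead of leaving pointer files
```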



In the case where you have a submodule that uses LFS, you can do the following:

    steps:
    - checkout: self
      submodules: true

    - task: PowerShell@2
      displayName: 'Git LFS checkout'
      inputs:
        targetType: 'inline'
        workingDirectory: '$(build.sourcesdirectory)\[submodule]'
        script: |
          git lfs install
          git config --global lfs.fetchinclude "*"
          git config --global lfs.fetchexclude ""
          git config --global lfs.concurrenttransfers 1
          git config --global lfs.skipSmudge true
          git lfs pull
          git lfs checkout
        failOnStderr: true

This will make sure the files are completely downloaded before moving on, so you don't end up trying to use 1 KB pointer files.



I think the error is pretty straightforward: you haven't provided git credentials in your pipeline.

And more importantly, may I ask why you are using git for binaries? How do you intend to have version control over something git doesn't understand? By that I mean, how do you intend to use features like diff and merge on binary files?

3 Comments

I've never had to explicitly call git lfs pull in other repos, so I don't get why I have to do it here, and if so, how to pass credentials to it. That's what Git LFS is for; it's common for a lot of repos to contain binary files like images.
Using SSH to authenticate on your LFS repo will do the trick. There is a good doc on GitHub that explains how to authenticate on Git LFS: github.com/git-lfs/git-lfs/blob/master/docs/api/…
I'd be interested if there is some Azure Repos/Pipelines specific info we're missing for how to do this.
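One Azure Pipelines-specific option for the credential error in the question is to keep the build's OAuth token around for later git commands. A sketch (persistCredentials is a documented checkout-step option that leaves the pipeline's access token in the local git config, which git lfs can then reuse; the step layout is illustrative):

```yaml
# Sketch: persist the pipeline's OAuth credentials at checkout so a
# later "git lfs pull" step can authenticate against Azure Repos.
steps:
- checkout: self
  lfs: true
  persistCredentials: true  # keep the build's token for subsequent git commands

- script: git lfs pull
  displayName: 'git lfs pull (uses persisted credentials)'
```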
