Bitbucket Pipelines Beta: continuous delivery inside Bitbucket (blog.bitbucket.org)
190 points by Fenntrek on May 24, 2016 | 66 comments



This is great. While I only use Travis for build/testing at the moment, I really appreciate the real competition between GH / GL / BB. Users of all the platforms win because of this.


BB offering free unlimited private repos and CI steps up the competition for sure.


Don't forget VSTS either: https://www.visualstudio.com/products/visual-studio-team-ser...

Coming from a non-microsoft background (github + jenkins + travisci) I have been pleasantly surprised. Full disclosure, I currently work at M$.


This is pretty much identical to how GitLab does it. I was hoping they would take it at least one step further, not just copy it.


What could have been made better?


At GitLab we announced pipelines two days ago: https://about.gitlab.com/2016/05/22/gitlab-8-8-released/

Some of the ideas we're working on to improve it further:

- deployment environments (acceptance, pre-prod, prod)

- manually confirming deployments for production

- review apps deployments of feature branches


I wrote the npm/npm Enterprise integration for pipelines:

http://blog.npmjs.org/post/144855273708/announcing-npm-for-a...

I was really impressed; it's really slick having the source-control/collaboration and CI/CD so tightly integrated.


Sorry if this sounds like a newbie question, but can I use this to run my test suite every time someone pushes to a feature branch and/or before anything is merged to master?


Yes. You can run any test framework that runs inside a Docker container. And you can specify different "steps" for different branches (either by name or by globbing).
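
For illustration, a minimal bitbucket-pipelines.yml along those lines might look like the sketch below. Treat the image name and test commands as placeholders rather than anything from the docs:

  # hypothetical Node project; swap in whatever image/commands your stack needs
  image: node:4
  pipelines:
    # runs on every push to a branch without a more specific match
    default:
      - step:
          script:
            - npm install
            - npm test
    # branch-specific steps, matched by name or glob
    branches:
      feature/*:
        - step:
            script:
              - npm install
              - npm test

Combined with branch permissions (e.g. only allowing changes to master via pull request), that covers the "test every push to a feature branch before merge" case.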


This page has a nice diagram and explains how to configure different pipelines for different branches: https://confluence.atlassian.com/bitbucket/configure-bitbuck...


It's too bad this is git only for now. Is there any plan to add support for mercurial repositories?


For the beta, only Git repositories are supported. We have plans to support Mercurial in the future.


Great to hear. Looking forward to trying out Pipelines once hg support is ready.


Oh. Well shit. I signed up for the beta without realising this.

At this point, the only thing keeping me on BitBucket instead of GitLab is Mercurial support.


Serious question... are you using HG because of history, or do people actively choose HG over Git for new projects still?

Have used both; they are so close that it seems "odd" to go for the far less popular one, unless you have a really old HG repo and haven't bothered to switch.


I prefer mercurial over git and will choose it every time for new projects unless there's some other concern preventing that.

Mercurial has a number of features that git never implemented - in particular revsets. I also prefer the hg CLI over the git CLI. Mercurial has sane, concise online help, and a lot of work went into the design of the command-line to be consistent, composable, and made of pieces that do one thing and do it well.


Seconding this sentiment. Far, far prefer hg to git. TortoiseHg blows away the Git UIs as well, IMO. I always feel like I'm poking around in the dark in SourceTree.


I do, assuming it's just for me. I still much prefer the hg CLI over git's.

But I'll grant it's not worth fighting over if I am working with collaborators who have a git preference.


I really like to ask whether someone has a genuine Git preference, or whether Git is just the only thing they know, enforcing the VCS === Git point of view on everyone around them...


Kasey, is it "origin/master" or "origin master"? Ha ha


Yes, many of us choose hg. Its development is lively, it keeps innovating, it's pleasant to use. Vive la résistance!


Interesting, I wonder if they always planned to launch their beta today or whether it got expedited after Gitlab's announcements over the last couple of days.


I can give some more context on our launch. Today is the start of AtlasCamp, our annual developer conference, in Barcelona. We planned the launch on that date a while ago because it's the best time for us to share this exciting news.

We've always been invested in the CI/CD market (Bamboo has been around a long time), and Pipelines just made sense as a way to help all Cloud teams build great software.

Sten, Bitbucket Pipelines Product Manager


Didn't realise today was AtlasCamp; I can see why you would have chosen it for the news.

Congrats on the beta launch, the more competition there is, the better for everybody!


Thanks! It's an exciting time for us and we're just getting started.


What about Continuous Integration? Though I like the many options we now have to easily set up some CI, a lot of enterprises still rely on old-fashioned on-premise CI. I can only wonder about the impact of deprecating Bamboo Cloud and what to use next.


I'm one of the Developer Advocates at Atlassian with a focus on the CI/CD space.

For "old-fashioned on-premise" CI/CD, Bamboo Server is still a solid offering from Atlassian, with active development on new features and support for existing ones. Discontinuing Bamboo Cloud is more about being able to "right-size" our cloud offerings so Atlassian can offer a CI/CD service for a team's first microservice deployed into AWS Elastic Beanstalk, and that scales up without overhead to many services each with many instances in a more complex environment like AWS ECS. And not just for AWS but for Azure, Google, Heroku, or whatever your choice of cloud platform. I believe Bitbucket Pipelines will be that next generation solution, while Bamboo will continue to serve on-premise needs for many years to come.


Nice to see you answering here :)

With Bamboo Cloud you were able to set up a pretty convenient "intermediate" solution, with Bamboo Cloud + an agent on your servers. Will it still be possible with Pipelines?

Also, I couldn't find the doc for aggregating test results.


Not so much. One of the things that I think makes Pipelines better suited for cloud is that it's agent-less. But that does mean there's no option to run an agent on-premise to bridge pipeline execution. Indeed, if you are accustomed to Bamboo, you are likely to find Bitbucket Pipelines rather minimalist. For example, there is currently no facility for aggregating test results.


How does this compare to shippable? Specifically, does it build docker containers which can be pushed to a registry (e.g. google's) and then deployed using Kubernetes into google cloud?


We don't yet support running Docker commands as part of the pipeline [1]; however, it's something we will be looking into in the future.

[1] https://confluence.atlassian.com/pages/viewpage.action?pageI...


Curious if there's going to be an on-prem version of this? We run Bitbucket at my company and Jenkins Enterprise as well. How does Bitbucket Pipeline compare to Jenkins Pipeline (https://jenkins.io/solutions/pipeline/)?


(Disclaimer: I'm an Atlassian employee.)

Bamboo is still the recommended solution for on-premises installations. The requirements and practicalities of on-premises vs cloud CI are different enough that they warrant different approaches.

That said, Bamboo supports scaling builds using AWS, and has 1st-class support for Docker-based build/test setups. I gave a talk on this at Atlassian's Summit last year if this sounds useful: http://summit.atlassian.com/videos/build/docker-continuous-i...


I really want a way to configure Bamboo via a text configuration file though. We're using it now, but we have to copy the configuration from build job to build job, and each job ends up subtly different over time. :( Plus I really want a way to say: here are the steps to deploy to a server, now run those steps against these three servers. It would be even better if each server could potentially have its own ssh key for deployments, to prevent a hacker from using Bamboo to access all the other servers.


You're not alone: https://jira.atlassian.com/browse/BAM-1223

  Dear Atlassian: This request has now been open for 9 years and has 247 votes.
  
  9 years.


I think that's where Bitbucket Pipelines comes into play. Their YAML file is similar in concept to TravisCI and others. Jenkins Pipeline has a Jenkinsfile, which is also similar in concept but based on Groovy. While there's a learning curve for that, I would argue it's definitely more powerful.

https://confluence.atlassian.com/bitbucket/configure-bitbuck...


I'm not so sure about that. First off, the pipelines feature doesn't exist on Bitbucket Server, and Atlassian recommends you use Bamboo there. Secondly, that doesn't help with managing secure access to production servers with potentially sensitive data. You'd just be moving from storing all the passwords/ssh keys in Bamboo to storing them all in your version control system.


Am I the only one that can't see the post at Atlassian blogs?

It's saying 'Sorry, that post was not found.'


The page takes 53 seconds to load for me. Initial shell loads quickly, but the content does not appear for almost a whole minute. (Chrome on OS X with all extensions off.)

This doesn't help improve my confidence in Bitbucket, which is already pretty low based on past uptime: https://statusgator.com/services/bitbucket



Based on the video, this appears to require Docker [1]

I can only wish for support of FreeBSD/Jails.

[1] https://www.youtube.com/watch?time_continue=122&v=p5KgjeZB8W...


I think they're saying they run the pipeline builds using Docker containers. It's how they can build easily based on any branch, i.e. what they use behind the scenes, not what you need to be using. Although I'm somewhat doubting myself now...


That's correct. Bitbucket Pipelines uses Docker as an execution environment for builds. However, if the goal of your build is to produce a container, whether Docker or otherwise, Bitbucket Pipelines cannot do that at this time.


What about multi-language repositories? If I have Java + Ruby, how will Pipelines work?


It uses Docker to manage the installed dependencies, so you can build your own Docker image with whatever you need and then use that in the pipeline configuration: https://confluence.atlassian.com/bitbucket/use-docker-images...
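
As a rough sketch (the image name below is hypothetical and the commands are placeholders), referencing your own published image is just the top-level image key:

  # a custom image you have pushed to Docker Hub with both Java and Ruby preinstalled
  image: myaccount/java-ruby-build:latest
  pipelines:
    default:
      - step:
          script:
            - ./gradlew test
            - bundle exec rake test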


You mean I should build a Docker image with both Java and Ruby? The image would be much larger than intended in that case. In production, I would have two different Docker images.


At this time, it is a limitation of the beta that Bitbucket Pipelines can only associate a single image with your build. Avoiding the complexity of chaining images helped us to ship more quickly. I expect this will change before GA. For now, you would have to effectively "merge" the Dockerfiles yourself and publish the image so Bitbucket Pipelines can pull it.


It is unfortunately common to ship Docker containers to production that contain the entire tooling suite to build the thing being deployed, and it sounds like that might be what is happening in your case?

If I understand right the thing you're expected to provide to run inside "pipelines" is a container in which your build can be performed; the output of that build might also be a Docker container, but hopefully there is not a requirement that it be the same container in which you are performing the build. Or you might be shipping something completely different, totally unrelated to Docker, as your build output that gets sent onward to production. Of course all of this is just a guess, it will become more clear as some of us start to enter the beta program.


Kyle, this is correct. We use containers simply to create the environment in which we'll execute the script commands in isolation. You can start with a small container and install dependencies during the run or you can prepare a bigger container with the dependencies installed already.
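
To make that concrete, here is a hedged sketch of the "install during the run" approach on a stock image; the package names are assumptions, so adjust for your stack:

  image: ubuntu:16.04
  pipelines:
    default:
      - step:
          script:
            # install build dependencies at the start of each run
            - apt-get update && apt-get install -y build-essential python
            - make test

The trade-off is build time (re-installing on every run) versus the effort of maintaining a pre-baked image.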


Has the pricing of this feature been disclosed? I couldn't find it.


Pipelines is free during beta [1]. The post-beta pricing has not been announced.

[1] https://confluence.atlassian.com/display/BITBUCKET/Bitbucket...


Hi,

I'm one of the Product Managers on Bitbucket Pipelines. The beta is indeed free, with a limitation on the number of minutes per user per month (starting at 300 mins/user/month, but that may change during the beta).

We haven't decided on the post-beta pricing yet. The beta will help us understand better how our customers are using it so that we can price it accordingly. We're leaning towards a model that scales well with the number of users on an account.

Let me know if that helps.

Thanks,

Sten Pittet


Do you have some details on the kind of hardware Bitbucket Pipelines will run on, as that can affect the time it takes to run a workflow? More specifically, how many cores and how much memory will the containers have access to?


We will be experimenting with different configurations during the beta to find the right fit. Some details about the available resources are published in our documentation [1] and will be updated as we move forward.

[1] https://confluence.atlassian.com/pages/viewpage.action?pageI...


I got beta access, created a custom Docker image, and got my tests to run; however, lots of them are failing because I need Redis. Is there a way to add a Redis server?


Currently not inside Pipelines. You could create your own Redis instance elsewhere and connect to that. The best option is to stub/mock Redis in the tests.


Add Redis to your custom docker image?
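
If you go that route, one hedged way to wire it up (assuming your custom image is Debian-based, and treating the image name and test command below as placeholders) would be to install and start Redis before the tests:

  image: myaccount/my-test-image
  pipelines:
    default:
      - step:
          script:
            # install Redis inside the build container and run it in the background
            - apt-get update && apt-get install -y redis-server
            - redis-server --daemonize yes
            - npm test

Baking redis-server into the image instead would save the install time on every run.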


Will Pipelines remain free for Bitbucket Cloud users?


It's FREE during the beta. We will have more information to share about pricing closer to the general availability time.


Well, here is to hoping it will remain free after the beta.


Yay, and once again we Google App Engine users get shafted on good CD support, especially when we use Bitbucket as our repo host.


While we don't have the "recipe" documented in an example repository, GAE does have the right knobs and levers that would make it possible to use Bitbucket Pipelines with GAE. Give it a try! And even if we haven't made an example repository yet, don't rule it out.


I'd love to, I just have no idea where I would even start. Any advice?


I've written some of the code for other deployment targets. One of my earliest was for S3: https://bitbucket.org/ian_buchanan/pipeline-example-aws-s3

So, the first thing to check is for simple REST APIs that you can curl. If I recall correctly, GAE is tricky because it takes more than 1 API call.

Next, check for a CLI or script library. I see Google provides an SDK that might work: https://cloud.google.com/appengine/downloads#Google_App_Engi...

In that case, we have to pull the library into an appropriate image. For example, here's how the Amazon folks solved for S3: https://bitbucket.org/awslabs/amazon-s3-bitbucket-pipelines-...

I won't have time to look at GAE more closely until next week, but hit me up on Twitter if you want to DM me: @devpartisan


My Travis config deploys to App Engine. Have a look at these two files:

https://github.com/GoogleCloudPlatform/golang-samples/blob/m...

https://github.com/GoogleCloudPlatform/golang-samples/blob/m...

Those might be useful for configuring Bitbucket Pipelines.
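
For what it's worth, here is a hedged sketch of what a GAE deploy step might look like. The google/cloud-sdk image and the gcloud commands are standard Google tooling, but the $GCLOUD_KEY_FILE / $GCLOUD_PROJECT variables are assumptions; I don't know yet how the Pipelines beta handles secrets:

  image: google/cloud-sdk
  pipelines:
    branches:
      master:
        - step:
            script:
              # decode a base64-encoded service account key injected as an environment variable
              - echo $GCLOUD_KEY_FILE | base64 -d > /tmp/key.json
              - gcloud auth activate-service-account --key-file /tmp/key.json
              - gcloud app deploy app.yaml --project $GCLOUD_PROJECT --quiet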


My colleagues have pointed out that the simple answer is: anything you can do with Bash, you can do in Bitbucket Pipelines.



