|
Bernhard Hiller wrote: What are your experiences with introducing CI?
Never used that term (which is why I initially skipped the question), but have introduced automated builds[^] and testing in three companies. Nothing fancy, motivated to create it by the Spolsky Test[^].
The batch-file gets the latest version of the source-code, builds it, runs the automated tests, runs FxCop, puts the output in a file and mails that every day. Set up once, easy to maintain, no additional costs (except the CPU-time) - hence, very little reason not to do it.
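In case anyone wants the idea rather than my exact file: a rough sketch of the same steps in Python - every path, command and address below is a placeholder, not our real setup:

    import subprocess, smtplib
    from email.mime.text import MIMEText

    # Hypothetical placeholders - substitute your own repo, solution and addresses.
    STEPS = [
        ("update", ["svn", "update", r"C:\src\product"]),
        ("build",  ["msbuild", r"C:\src\product\Product.sln", "/t:Rebuild"]),
        ("tests",  ["nunit-console", r"C:\src\product\Tests.dll"]),
        ("fxcop",  ["fxcopcmd", r"/file:C:\src\product\bin", "/console"]),
    ]

    def run_all():
        """Run each step, capture its output, stop at the first failure."""
        report = []
        for name, cmd in STEPS:
            result = subprocess.run(cmd, capture_output=True, text=True)
            report.append("== %s (exit %d) ==\n%s" % (name, result.returncode, result.stdout))
            if result.returncode != 0:
                report.append(result.stderr)
                break
        return "\n".join(report)

    def mail_report(body):
        msg = MIMEText(body)
        msg["Subject"] = "Daily build report"
        msg["From"] = "build@example.local"      # placeholder address
        msg["To"] = "team@example.local"         # placeholder address
        with smtplib.SMTP("localhost") as smtp:  # assumes a local SMTP relay
            smtp.send_message(msg)

    if __name__ == "__main__":
        mail_report(run_all())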
At first, the batch-file would run around midnight, and we'd read the mails in the morning. Currently, it's running at eight o'clock, with the report coming in an hour later. A short list of benefits:
- The code is never "broken" for more than 12 hours
- Always ready to deliver a "production build"
- More confidence in the code-base (none of the external consultants wrote sh*t that breaks my code)
- Better maintenance of the tests (since they're run every day)
- Statistics on the software: 500+ successful tests every day and the blessing of FxCop are an indication of the quality of your code.
Bernhard Hiller wrote: a lot of money could be wasted, and it's better to get a consultant for that...
...I'm almost considering playing the consultant and selling the batch-file under a random cool name from the BuzzWord forum
Nah, guess it would be better to have some Rent-A-Coder write a GUI for it first
Bastard Programmer from Hell
if you can't read my code, try converting it here[^]
|
Thanks for your hints. "Continuous integration" is the common term for that process; see also Wikipedia: http://en.wikipedia.org/wiki/Continuous_integration[^], which lists a number of tools.
The Spolsky test is an interesting idea. Our score is presently about 1. That's the environment where we have to start...
|
Bernhard Hiller wrote: which lists a number of tools.
Quite the list; I should be glad I did not waste a Rent-A-Coder on that
Bernhard Hiller wrote: Our score is presently about 1. That's the environment where we have to start...
Is the environment willing to change? If yes, then they might be doubling their productivity soon.
Good luck
Bastard Programmer from Hell
if you can't read my code, try converting it here[^]
|
If you want to set up a free CI solution for yourself, you will have to spend some time reading its documentation the first time or two. I myself played around with buildbot (Python), which I hated, and with Hudson CI (Java), which is very nice. A bat file is unfortunately hard to monitor (no web interface to look at with a browser), and it's hard to set up automatic tests that run on submit to your version control system. Currently we use a custom CI server (written in C++); it's pure and simple, and supports exactly the few features I listed below in my other post. Over a year we found out exactly what we need; none of the existing solutions provided all of it, and they had feature bloat we don't need.
|
pasztorpisti wrote: A bat file is unfortunately hard to monitor (no web interface to look at with a browser)
ECHO "Starting unit-test" >/var/www/CIprogress.htm
Custom applications are hard to customize, and a batch-file is easily extended. Anyone who can write a console application in VB.NET can extend its functionality. I'll put you a progress-bar in a tray icon to follow the progress of the batch-file, if required
pasztorpisti wrote: Currently we use a custom CI server (written in C++); it's pure and simple
Simpler than what I suggested?
Bastard Programmer from Hell
if you can't read my code, try converting it here[^]
|
How do you start the script? A CI system can be started from any machine, from a browser. You can't start it twice by accident. You have a nice page with colors, listing successful and failed builds. The build starts automatically on submit. You can implement most of these from batch files, but these batch files become uglier than C++ code quickly! Another reason I usually prefer a single-binary solution is that script systems (especially heterogeneous ones) are much more error prone.
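(For illustration, the "can't start it twice by accident" part is usually done with a lock file; a minimal Python sketch, with an arbitrary lock-file name:)

    import os, sys

    LOCK = "build.lock"  # arbitrary lock-file name

    def acquire_lock():
        """Atomically create the lock file; bail out if a build is already running."""
        try:
            fd = os.open(LOCK, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
            os.write(fd, str(os.getpid()).encode())
            os.close(fd)
        except FileExistsError:
            sys.exit("another build is already running")

    def release_lock():
        os.remove(LOCK)

    acquire_lock()
    try:
        pass  # ... run the build here ...
    finally:
        release_lock()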
|
pasztorpisti wrote: How do you start the script?
Like any other, although the Windows Task Scheduler might be handy. Or make it a post-build option from the IDE. It's as easy as starting any GUI-less executable.
pasztorpisti wrote: A CI system can be started from any machine from a browser.
The argument that it does builds for multiple platforms would have been better. It's relatively easy to start an executable using ASP.NET.
pasztorpisti wrote: but these batch files become uglier than C++ code quickly!
...I do not see how anyone could keep a tree with source code tidy if they can't even do so with a simple script.
An off-the-shelf product does have advantages, you're right. And yes, it does provide some features that would cost time (and thus money) to implement if you need them. And yes, for larger teams it is recommended to go for an existing product, especially since no one will be responsible for maintaining the script, and it'll grow into a pile of hacks from various employees. The most decisive limitation here would be the budget.
Bastard Programmer from Hell
if you can't read my code, try converting it here[^]
|
Choose whatever option you like, but please try the options before making suggestions and voicing an opinion. We tried scripts, a prepackaged product, and the current solution, and now we are happier than ever. Choose the option that suits you and minimizes the maintenance work you need to invest! Good luck!
|
Our builds are quite complicated: multi-platform, multi-machine, and sometimes even the massive data processing is split across separate machines. If your build is a single-machine build with a single log file as output, your problem is much easier to solve.
Here is what we find useful in a CI system:
- A web interface through which you can check the result of builds, the log files, and a button which can be used to start or queue up a build.
- scheduled builds
- builds that are triggered automatically when someone uploads new code to your version control system
- sending mails about the results to a few fixed addresses and to those who submitted the latest changes in the version control system since the last successful build (blamelist).
Automated tests are not in my list of important things, because the config file of our build system is just a set of tasks arranged in a dependency graph; each task is a list of commands to execute on one of the machines. Automatic testing is just one more command after the commands that create the build. Storing a copy of each build on a storage machine is likewise just an additional command that runs a Python script to do the job.
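To make the "tasks in a dependency graph" shape concrete, a toy Python sketch (task names and commands are invented; our real server is C++, but the config has the same shape): tasks declare what they depend on, and the runner executes them in dependency order.

    import subprocess

    # Invented example config: task -> (dependencies, list of commands).
    TASKS = {
        "checkout": ([],                 [["svn", "update", "src"]]),
        "build":    (["checkout"],       [["make", "-C", "src"]]),
        "tests":    (["build"],          [["src/run_tests"]]),
        "archive":  (["build", "tests"], [["python", "copy_build.py"]]),
    }

    def run(task, done=None):
        """Depth-first: run dependencies first, each task exactly once."""
        done = set() if done is None else done
        if task in done:
            return
        deps, commands = TASKS[task]
        for dep in deps:
            run(dep, done)
        for cmd in commands:
            subprocess.run(cmd, check=True)  # abort the build on first failure
        done.add(task)

    run("archive")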
Some companies hire a build engineer for this task; you can do that too if build maintenance is very important. We found CI very useful, especially since our main development platform is Windows and we want automatic tests immediately after C++ code is submitted to the version control system, and it has to compile with the Visual C++, g++, and LLVM compilers as well. Data errors are also often caught by CI.
|
Thanks for your comments.
I also find the points you mentioned useful. But I'd include some automated tests (actually lots of regression testing is required, but who wants to do it by hand...).
|
As I mentioned, the build system basically does nothing but run commands on different machines. An automated test is just another command for the build system to run, one that can be displayed separately on the web interface. Don't overcomplicate the build system like I did the first time! It just runs your commands; it shouldn't know about your development environment, tests, and so on... If you want something complicated, then run a script from your build system (I use Python for that). I put a command into the config of our build system if I want to see it in a separate "box"/log file on the web interface; otherwise I just put a set of commands into external script files (if it's OK to put their output into the same log file).
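For example, the whole "one command per box, one log file per command" idea is tiny (Python sketch; the command list is invented):

    import subprocess

    # Each entry becomes its own "box" on the web interface: one log file per command.
    COMMANDS = [
        ("build",     ["make", "all"]),
        ("unit_test", ["python", "run_tests.py"]),
    ]

    for name, cmd in COMMANDS:
        with open(name + ".log", "w") as log:
            subprocess.run(cmd, stdout=log, stderr=subprocess.STDOUT)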
|
If you are using a standard language, and you already have automated tests (unit tests and automatable system tests) that use a standard framework, e.g. JUnit and Ant, then CI is pretty easy to set up. If you're using a less conventional language like Python or Ruby, you would probably need to create at least a plugin, if not a whole system, yourself.
CI is very valuable in a medium to large project, I strongly recommend you get it set up.
|
Thanks for your comments.
We are using C#. But automated tests are still to be introduced.
|
The build system is independent of the development environment you use, and of your language. I rarely see overly customized build systems (for example, I saw one that took screenshots of the auto-tests of an OpenGL application and showed them on the web interface), and it's rarely worth the effort.
|
C# is a standard language, so you should have no problem. We used TeamCity (not free) on a .NET project and everything worked smoothly.
You need automated tests for efficient development anyway, so hopefully you can get them written as part of the same push as introducing CI. Does that mean you don't have unit tests? In 2012 that is not really on :P
|
CI can be very valuable for small projects too; for example, if you start a project that has to run on many different architectures (PC, Mac, Xbox, iPad, etc.), then you probably want a dedicated development environment (let's say PC), and you want your cross-platform code to be tested immediately after submitting it to your source control system (or maybe before submitting).
It's very easy to set up CI for a small project if you have set one up before. For a small project it's usually just running a command on a machine, or on a few machines (one per platform), plus setting up email notifications and the like in the config, which is easy.
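As a sketch (Python; the host names are placeholders, and it assumes passwordless ssh plus an existing checkout with a build script on each machine):

    import subprocess

    # One build machine per platform - hypothetical host names.
    PLATFORMS = {"windows": "winbuild1", "mac": "macbuild1", "linux": "linbuild1"}

    failures = []
    for platform, host in PLATFORMS.items():
        # Assumes each machine already has a checkout with a ./build script.
        result = subprocess.run(["ssh", host, "cd product && ./build"])
        if result.returncode != 0:
            failures.append(platform)

    if failures:
        print("FAILED: " + ", ".join(failures))
    else:
        print("all platforms green")

Mailing the result is the same smtplib dance shown earlier in the thread.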
|
Our team used both CruiseControl[^] and TFS.
We also used PowerShell to automate some tasks within the build scripts.
There is great value in implementing build automation and CI in general. Start simple and add more features to your build steps later.
Generally, automation is never a waste of time, especially when it comes to repetitive tasks like builds.
Hesham A. Amin
My blog
twitter: @HeshamAmin
|
We just put a website managed by a CMS (Joomla) online. We've got a test server and a production server. The question we ask ourselves is what the correct strategy is for updating the website content: should we give editors the right to modify the production server directly through Joomla's web interface, or should we restrict editing to the test server (or a clone of it) and find a way to automate the transfer from test to production?
The first solution is the easiest one but seems the most dangerous (for instance, how do we reconstruct the production server data if it crashes?), the second one is safer, but designing an automatic way to make the transfer is not straightforward...
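To give an idea of what I mean by "automate the transfer": since a Joomla site is essentially a MySQL database plus a directory of files, I imagine something along these lines (every host, path and credential below is a placeholder) - but is this the right approach?

    import subprocess

    # Placeholders - substitute real hosts, paths and credentials;
    # a real script would also back up production first.
    TEST_DUMP = ["mysqldump", "-h", "test-server", "-u", "joomla", "joomla_db"]
    PROD_LOAD = ["mysql", "-h", "prod-server", "-u", "joomla", "joomla_db"]

    # 1. Copy the database: dump from test, load into production.
    dump = subprocess.run(TEST_DUMP, capture_output=True, check=True)
    subprocess.run(PROD_LOAD, input=dump.stdout, check=True)

    # 2. Copy the files (images, attachments) that live outside the database.
    subprocess.run(["rsync", "-az", "test-server:/var/www/joomla/images/",
                    "prod-server:/var/www/joomla/images/"], check=True)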
So in your opinion, dear fellow Cpians, what would be the best thing to do? Any food for thought is welcome!
When they kick at your front door
How you gonna come?
With your hands on your head
Or on the trigger of your gun?
Fold with us! ¤ flickr
|
KaЯl wrote: for instance how do we reconstruct the production server data if it crashes?...what would be the best thing to do
You can't answer the second question until you answer the first. And that can only be answered by your business.
You might also ask yourself the following:
- What happens if someone does an edit and wants to revert it?
- What happens if someone does an edit and that edit itself is the cause of the crash? What if it doesn't cause a crash for two weeks?
- What if someone makes an edit that is inappropriate? How do such edits get reviewed?
KaЯl wrote: the second one is safer but designing an automatic way to make the transfer is not straightforward...
You need a process to do a production install. Whether it is automatic is a secondary issue.
|
Thanks for your input.
When they kick at your front door
How you gonna come?
With your hands on your head
Or on the trigger of your gun?
Fold with us! ¤ flickr
|
If you're talking about just the content, not the system code, I think you should update directly on the production server and have a good review workflow.
The other option will be a bit hard to manage if you allow site users (not editors) to post comments, for example. How will you manage the data then?
|
Indeed, that's the choice we made. Editors have to be responsible; they aren't children anymore, are they?
And if they mess with the servers, then I should still have my whip somewhere, if they deserve a little punishment...
When they kick at your front door
How you gonna come?
With your hands on your head
Or on the trigger of your gun?
Fold with us! ¤ flickr
|
I am about to start developing a new website, and I am a little stuck as to the correct architecture to use to achieve what I need.
I have been given access to a data feed which provides me with live, real-time data. I need to cache this data and do some processing on it, and I can only create one connection to this service - it will block any subsequent attempts from the same IP. Therefore, connecting directly from a website is not going to be feasible.
It looks like I will need to create two separate parts to this project: a back-end service that retrieves the data and performs any processing required, then a separate website that accesses the processed data. As the data is real time, the ability to update the web page in real time would also be desirable. As well as this, I also anticipate adding a few mobile apps in the future, which will also need to connect to the back-end system.
Does anyone have any suggestions on what technologies to use? Obviously JSON would be ideal for the inter-process communication, but what kind of back-end technologies can I use that easily support JSON while still allowing a persistent service that can maintain a permanent connection to the data source?
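To make the question concrete, here is the kind of shape I have in mind, sketched in Python (the feed address and protocol are invented - assume a line-oriented TCP stream): one thread owns the single allowed connection and keeps a processed snapshot in memory, while a small HTTP layer serves that snapshot as JSON:

    import json, socket, threading
    from http.server import BaseHTTPRequestHandler, HTTPServer

    latest = {}               # most recent processed snapshot
    lock = threading.Lock()   # guards `latest`

    def process(line):
        # Placeholder for the real parsing/aggregation.
        return {"raw": line.strip()}

    def feed_reader():
        """Owns the single allowed connection to the feed (invented address/protocol)."""
        conn = socket.create_connection(("feed.example.com", 9000))
        for line in conn.makefile():
            record = process(line)
            with lock:
                latest.update(record)

    class SnapshotHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Website and mobile apps both hit this endpoint for the cached data.
            with lock:
                body = json.dumps(latest).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

    threading.Thread(target=feed_reader, daemon=True).start()
    HTTPServer(("", 8080), SnapshotHandler).serve_forever()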
Anyway, I'm rambling a bit. I hope it's clear, and I look forward to people's advice.
|
Member 849873 wrote: the ability to update the web page in real time would also be desirable. As well as this, I also anticipate adding a few mobile apps in the future, which will also need to connect to the back-end system.
Depends on your definition of "real time", but humans don't operate in "real time" anyway. So, for example, it is pointless to attempt to update a GUI 5 times a second just because the data is received that quickly.
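In other words, even if records arrive five times a second, the page only needs to coalesce them into roughly one visible update per second; a Python sketch of the idea (the display hook is hypothetical):

    import time

    last_push = 0.0

    def refresh_display(record):
        print("showing", record)        # stand-in for the real GUI/page update

    def on_record(record):
        """Called for every incoming record; refresh the UI at most once per second."""
        global last_push
        now = time.time()
        if now - last_push >= 1.0:      # coalesce bursts into one visible update
            refresh_display(record)
            last_push = now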
Member 849873 wrote: can I use that easily support JSON while still allowing a persistent service that can maintain a permanent connection to the data source?
You need requirements and an architecture design before deciding on "technologies".
For example, the following questions need to be answered:
- Exactly how fast does this data arrive?
- Does it need to be retained (persisted)? If so for how long?
- The user apps do what with the data? Graph it? Scroll a bunch of numbers? What?
- How many users will there be? And then ask again: REALISTICALLY, how many users will there be?
- Based on the above information, what are realistic long-term volume needs? This includes storage and network.
- Does the data feed stop or slow down? Specifically, how do you detect that the connection has stopped receiving data? Additionally, does the source allow you to remain connected for long periods of time? (It might require a reconnect or a heartbeat message.)
- How will administration occur? For example, if the connection goes down, does someone need to be notified? If so, how? If the main site goes down, does someone need to be notified? How will you know if the main site went down? (Obviously these could be answered by waiting for the users to complain, but that might not be ideal.)
|
I was going to make the same comment with regard to the "real-time" aspect of things... it has a broad range of meanings depending on context.
For DSP engineers, real time is generally barely feasible on a desktop (depending on rates), so there's hardly any point in trying to refresh anything a human will "see" and interpret in that context.
|