(1) Discuss the importance of maintaining the quality of the code, explaining the different aspects of the code quality



Code quality is hard to measure, and every developer has a slightly different opinion about what quality means.
There is also a myth that code is simply either “good quality” or “bad quality”. There is no sharp line dividing the two, and it is the programmer's job to keep a sensible ratio between them. You cannot have a codebase that is optimal and elegant on every line. Writing that way is counter-productive (even if it were possible), because sometimes you have to make a piece of code less elegant in order to keep some level of simplicity and readability.
Once you have a model of how your application should work, you will notice that different areas of the code serve different purposes. Some areas are abstract and some are very specific.
For example, a generic module through which you perform async requests has to be flexible, and therefore more abstract and generic, so that its API can be used across the whole app.
Other parts are different, such as a component that renders very specific content, for example a calendar. You know in great detail what kind of data flows through those functions, so you can write them in a very literal manner. Such modules are not very reusable, but in the end they produce the final result of your app.
In the more abstract areas you can perform complicated operations and lose some readability, because you will not have to read that module in order to work with it. The only thing that has to stay somewhat clean and simple is its public API.
However, in places where you expect a lot of work to be done, and where you expect routine changes to happen, you should aim for simplicity more than elegance and “smartness”.
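To make this concrete, here is a minimal Java sketch (the names AsyncRequestClient, fetch and CalendarView are hypothetical, invented for illustration). The generic module hides whatever complexity it needs behind a small, clean public API, while the specific component is written in a plain, literal style that matches the exact data it renders.

import java.time.LocalDate;
import java.util.List;
import java.util.concurrent.CompletableFuture;

// Generic, reusable module: the implementation may be clever and hard to read,
// but the public API stays small and simple.
public interface AsyncRequestClient {
    <T> CompletableFuture<T> fetch(String url, Class<T> responseType);
}

// Very specific component (separate file in a real project): no abstraction
// tricks, just literal, readable code for the one job it has.
class CalendarView {
    String render(List<LocalDate> holidays) {
        StringBuilder html = new StringBuilder("<ul>");
        for (LocalDate day : holidays) {
            html.append("<li>").append(day).append("</li>");
        }
        return html.append("</ul>").toString();
    }
}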

Code quality is a loose approximation of how long-term useful and long-term maintainable the code is.

Code that is thrown away tomorrow: Low quality.

Code that is being carried over from product to product, developed further, maybe even open sourced after establishing its value: High quality.


Since looking into the future can be somewhat tricky, we look at present signs that may help predict it.

With code it translates to:
  • Clear and understandable design and implementation.
  • Well defined interfaces.
  • Ease of build and use.
  • Ease of extensibility.
  • Minimum extra dependencies.
  • Tests and examples.
  • Documentation, or better yet, self-explaining code (see the small example after this list).
  • Up-to-date means of contacting the developer.
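As a small, hypothetical Java illustration of the last two points: the second method needs no comment at all, because the names carry the documentation.

class ElapsedTimeExample {
    // Needs a comment to be understood: d = elapsed time in days
    static int d(long now, long start) {
        return (int) ((now - start) / 86400000L);
    }

    // The same logic, self-explaining without any comment
    static int elapsedDays(long nowMillis, long startMillis) {
        long millisecondsPerDay = 86_400_000L;
        return (int) ((nowMillis - startMillis) / millisecondsPerDay);
    }
}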






(2) Explain different approaches and measurements used to measure the quality of code  


Quality is one of the four key project constraints, which need to be planned and controlled by a project manager during the entire project lifecycle. In order to plan and control it, the PM first of all has to understand how to measure it.




Depending on the circumstances, you can use different techniques to evaluate the quality of a software product:
  • Completeness: What portion of the required features is actually implemented?
  • Asking users: How do typical users feel about the software?
  • Metrics: Some metrics can give you a good idea about the quality of the code (see the sketch after this list)
  • Process: The use (or not) of certain processes is a good hint about the quality of a development process: bug tracking, automated tests, version control tools...
  • Bug detection: The bug detection rate is a good indicator
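One widely used metric is cyclomatic complexity: roughly, the number of independent decision paths through a piece of code. The hypothetical Java method below has a complexity of 4 (one base path plus three decision points); methods that climb far above about 10 are usually flagged by metric tools as candidates for refactoring.

class OrderMetricsExample {
    // cyclomatic complexity = 1 (base path) + 3 decision points = 4
    static String classifyOrder(int items, boolean expressShipping) {
        if (items <= 0) {                                  // decision 1
            return "empty";
        }
        if (items > 100) {                                 // decision 2
            return "bulk";
        }
        return expressShipping ? "express" : "standard";   // decision 3
    }
}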







(3) Identify and compare some available tools to maintain the code quality 



Veracode Static Code Analysis Tool

Veracode is a static analysis tool which is built on the SaaS model. This tool is mainly used to analyze the code from a Security point of view.
This tool uses binary code/bytecode and hence ensures 100% test coverage. This tool proves to be a good choice if you want to write secure code.

RIPS code analysis
RIPS is the only code analysis solution that performs language-specific security analysis. It detects the most complex security vulnerabilities deeply nested within the source code that no other tools are able to find.
It supports major frameworks, SDLC integration, relevant industry standards and can be deployed as a self-hosted software or used as software-as-a-service. With its high accuracy and no false positive noise, RIPS is the ideal choice for analyzing Java and PHP applications.

PVS-Studio
PVS-Studio is a tool for detecting bugs and security weaknesses in the source code of programs, written in C, C++, C# and Java. It works in Windows, Linux, and macOS environment.
It is possible to integrate it into Visual Studio and other widespread IDE. The results of the analysis can be imported into SonarQube.

Kiuwan
Kiuwan is a SAST and SCA platform with the largest technology coverage and integrations in the market. With a DevSecOps approach, Kiuwan achieves outstanding benchmark scores (OWASP, NIST, CWE, etc.) and offers a wealth of features that go beyond static analysis, catering to every stakeholder in the SDLC.

Kritika.IO
Kritika.IO analyzes your code and provides useful information on your code style, code smells, complexity, duplications. It also analyzes open source dependencies licenses and looks for known vulnerabilities.
Kritika.IO integrates with GitHub, BitBucket and GitLab. It uses progressive pricing that depends solely on the amount of code analyzed. Analyzing open source projects is completely free and feature complete. Among unique languages, it supports Perl and Tcl.

Gamma
Gamma is an intelligent software analytics platform, developed by Acellere. It supports developers and teams in building higher quality software in less time, by speeding up code reviews.
It automatically prioritizes hotspots in the code and provides clear visualizations. With its multi-vector diagnostic technology, it analyses software from multiple lenses, including software design, and enables companies to manage and improve their software quality transparently.

Code Compare
Code Compare is a file and folder comparison and merge tool. Over 70,000 users actively use Code Compare while resolving merge conflicts and deploying source code changes. It is a free compare tool designed to compare and merge differing files and folders, it integrates with all popular source control systems (TFS, SVN, Git, Mercurial, and Perforce), and it is shipped both as a standalone file diff tool and as a Visual Studio extension.
Key features:
  • Text Comparison and Merging
  • Semantic Source Code Comparison
  • Folder Comparison
  • Visual Studio Integration
  • Version Control Integration and more

Reshift
Reshift is a SaaS-based software platform that helps software development teams identify more vulnerabilities faster in their own code before deploying to production.
It reduces the cost and time of finding and fixing vulnerabilities, identifies the potential risk of data breaches, and helps software companies achieve compliance and regulatory requirements.

Fortify
Fortify, a tool from HP, lets a developer build error-free and secure code. It can be used by development and security teams working together to find and fix security-related issues. While scanning the code, it ranks the issues found and ensures that the most critical ones are fixed first.
It is also regarded as one of the best tools for static analysis testing. It differs slightly from other static analysis tools in its ability to support various static analysis techniques such as pattern-based, flow-based, third-party, and metrics/multivariate analysis. Another good thing about the tool is that, besides identifying defects, it provides features that help prevent them.

Coverity Static Code Analysis Tool
Coverity Scan is an open source cloud-based tool. It works for projects written in C, C++, Java, C#, or JavaScript. The tool provides a very detailed and clear description of the issues it finds, which helps in faster resolution. It is a good choice if you are looking for an open source tool.
This automated tool can analyze more than 50 languages and works excellently regardless of the size of the project. In addition, it provides a dashboard that helps users measure quality and productivity.

CodeSonar Static Code Analysis Tool
This static analysis tool from GrammaTech not only lets a user find programming errors, it also helps find domain-related coding errors. It allows custom checkpoints, and the built-in checks can be configured as required. Overall it is a great tool for detecting security vulnerabilities, and its ability to perform deep static analysis makes it stand out from the other static analysis tools available in the market.
Understand
True to its name, this tool lets users understand code by analyzing, measuring, visualizing, and maintaining it, and it allows quick analysis of massive codebases. It is mainly used in the aerospace and automotive industries. It supports major languages such as C/C++, Ada, COBOL, Fortran, Pascal, Python, and other web languages.




(4) Discuss the need for dependency/package management tools in software development

Bower

The package management system Bower runs on NPM, which seems a little redundant, but there is a difference between the two: NPM offers more features, while Bower aims for a reduction in file size and load times for frontend dependencies.
Some devs argue that Bower is basically obsolete since it runs on NPM, a service that can do almost everything Bower can do. Generally speaking, this isn’t wrong.
But devs should realize Bower can optimize the workflow specifically for frontend dependencies. I recommend Ben McCormick’s article Is Bower Useful to learn more about the value offered by both package management tools.
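For context, a typical Bower workflow looks roughly like this (the package name is just an example); note that Bower itself is installed through NPM, which is exactly why the two are often compared:

npm install -g bower          # Bower is distributed as an NPM package
bower init                    # creates bower.json for the project
bower install jquery --save   # downloads jquery and records it in bower.json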

RubyGems


RubyGems is a package manager for Ruby with a high popularity among web developers. The project is open source and inclusive of all free Ruby gems.
To give a brief overview for beginners, a “gem” is just some code that runs in a Ruby environment. Gem versions can be managed by programs like Bundler, which keep everything updated.
Rails developers will love this feature, but what about frontend packages? Since Ruby is open source, developers can build projects like Bower for Rails. This brings frontend package management to the Ruby platform with a small learning curve.


 RequireJS


There’s something special about RequireJS in that it’s primarily a JS toolset. It can be used for loading JS modules quickly including Node modules.
RequireJS can automatically detect required dependencies based on what you’re using so this might be akin to classic software programming in C/C++ where libraries are included with further libraries.

 Jam


Browser-based package management comes in a new form with JamJS. This is a JavaScript package manager with automatic management similar to RequireJS.
All your dependencies are pulled into a single JS file which lets you add and remove items quickly. Plus these can be updated in the browser regardless of other tools you’re using (like RequireJS).

Browserify


Most developers know of Browserify even if it’s not part of their typical workflow. This is another dependency management tool which optimizes required modules and libraries by bundling them together.
These bundles are supported in the browser which means you can include and merge modules with plain JavaScript. All you need is NPM to get started and then Browserify to get moving.
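A minimal usage sketch, assuming a hypothetical entry file src/main.js, might look like this; the resulting bundle is then included with a plain script tag:

npm install -g browserify
browserify src/main.js -o dist/bundle.js   # bundles main.js and everything it require()s into one file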

Mantri
Still in its early stages of growth, MantriJS is a dependency system for mid-to-high level web applications. Dependencies are managed through namespaces and organized functionally to avoid collisions and reduce clutter.

Volo
The project management tool volo is an open source NPM repo that can create projects, add libraries, and automate workflows.
Volo runs inside Node and relies on JavaScript for project management. A brief intro guide can be found on GitHub explaining the installation process and common usage. For example, if you run the command volo create, you can set up a new project based on a library like HTML5 Boilerplate.









(5) What is a build tool? 

A build tool is a programming utility that is used when building a new version of a program. For example, make is a popular open source build tool that uses a makefile to ensure that source files which have been updated (and the files that depend on them) are compiled into a new version (build) of the program.



(6) Explain the role of build automation in build tools indicating the need for build automation

In the context of software development, build refers to the process that converts files and other assets under the developers' responsibility into a software product in its final or consumable form. The build may include:
  • compiling source files
  • packaging compiled files into compressed formats (such as jar, zip)
  • producing installers
  • creating or updating database schemas or data
The build is automated when these steps are repeatable, require no direct human intervention, and can be performed at any time with no information other than what is stored in the source code control repository.
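In practice, this means a fresh checkout can be turned into the finished artifact with a single, repeatable command and nothing else. A hypothetical example using Git and Maven (the repository URL is made up):

git clone https://example.com/acme/webshop.git
cd webshop
mvn clean verify     # compiles, runs the tests and packages the artifact with no manual steps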

Expected Benefits

Build automation is a prerequisite to effective use of continuous integration. However, it brings benefits of its own:
  • eliminating a source of variation, and thus of defects; a manual build process containing a large number of necessary steps offers as many opportunities to make mistakes
  • requiring thorough documentation of assumptions about the target environment, and of dependencies on third party products





(7) Compare and contrast different build tools used in industry  



There are many options when it comes to which software or platform you decide to use for your daily tasks at work. I've decided to write a list of common tools that we use in our development department because I thought it would be helpful for everyone else out there. 
A new piece of software takes a bit of time to adapt to, get used to, and understand. We've all been in a situation where we’re not really satisfied and have to start all over again with another tool. Let’s just admit it: it’s not a great feeling!
If you ended up on this page, it’s most probably because you know how important software development tools are and how they increase the efficiency and productivity of a team. In this list, you will find great programming tools we use at Apiumhub, as well as tools we use to make our day more efficient. So no, you won’t only find programming tools, but this is still what we use on a daily basis as programmers!

1. Terminal


We all use our terminal. It really depends on what you are working on; some of us use it every day while others might use it once a week. But in the end, we all use it! The terminal is a command line where you can execute processes. It’s quite useful and makes your job much faster. You can move between directories, download apps, and do many other things with just one command instead of going through a whole menu.

2. Tmux


When it comes to working with several terminals open for different processes, Tmux is one of the most useful tools. What is Tmux? It’s a terminal multiplexer that enables you to have several independent terminals inside one terminal. It divides the window into panes or tabs within the terminal and makes it easier to move between them.
If you’re interested, here’s a short Tmux cheat sheet that might come in handy.

3. Docker


Docker provides a software containerization platform that enables you to package your application or software together with the filesystem it needs. The resulting container can be moved and executed anywhere, and it contains everything required to run: code, system libraries, etc. This means that the software will run the same way everywhere and will not depend on its environment.
Why should you use Docker? Well, as containers have a different architectural approach, they are more efficient, they can run on any computer, on any infrastructure, and in any cloud. Docker enables you to spend more time on creating new features, fixing issues, and shipping software. It also makes it easier to collaborate among developers and system admins because it creates a common framework. Finally, Docker permits you to ship and scale the software faster. 
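A minimal sketch of that workflow, assuming a project with a Dockerfile and a hypothetical image name:

docker build -t myapp:1.0 .          # package the application and its filesystem into an image
docker run -p 8080:8080 myapp:1.0    # run the same image on any machine that has Docker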

4. IntelliJ


IntelliJ is an Integrated Development Environment. An IDE integrates all the tools that you need to develop a platform, so it has a code editor, a compiler, a debugger, etc. What we like about IntelliJ is that it has auto-completion and is really user-friendly and therefore easy to use. It also helps you navigate quickly through your code, provides error analysis, and offers handy quick fixes. It increases our productivity a lot.

5. Slack


Slack is an amazing app that we all use for team communication. It’s great because we can use it basically everywhere (it has a native app for iOS and Android), and since we’ve been using it, we have reduced emails sent internally. We have a team for each department but also have one for the whole company! It’s quite practical: you’ve got “channels” that you create for whatever you want (projects, teams, topics, etc.) and you can easily navigate between channels. You also have the option to have direct private messages with certain members of your team. The drag and drop is really nice to use, you can share all sorts of files and add comments to them, or tag people. You can also search your full history.

6. Chrome


Everyone uses a web browser obviously, and Google Chrome is one of the most popular out there. I love Chrome because it has a clean and simple UI. I found it to be very fast and I love that it syncs my bookmarks on all my devices. There’s a huge library of extensions and add-ons, it fills out the forms automatically, and I can search straight away from the address bar. Finally, the developer’s console is very quick and easy to use and makes life easier for front-end developers.

7. Feedly


Feedly is quite nice to use. It’s a sort of news feed where you can easily read the news that interests you. It’s very easy to use and has a minimalist design and personalized interface where you can organize all your favorite publications, YouTube channels, blogs, etc. and you then receive updates if there are new stories and videos that are published! It's very practical for staying up-to-date with what interests you. 

8. Jira


Jira was developed for agile teams to plan, track, and release software. Obviously, as we breathe Agile, we also use Jira. It helps a lot when it comes to project management. We found it great because it’s very customizable and has powerful features and tools for every phase of development. In one place, you can manage the team backlog, visualize the work in progress, and generate reports.

9. Git


Git is an open-source version control system for software projects. When developers are working on something, they have to make changes to the code regularly until they reach the final version.
A version control system saves every change made, allowing others to collaborate, make changes, and contribute. You will also find a copy of every developer’s work. Git enables you to synchronize team work and to work with the code, updating it at any moment. It’s one of the most popular systems because it handles conflicts very well (it focuses on the file content) and lets you create branches. I personally believe that if you don’t find Git on a list of software development tools, you should absolutely skip the page.
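A typical day-to-day flow looks something like this (the repository URL and branch name are hypothetical):

git clone https://github.com/acme/webshop.git
git checkout -b feature/payment-retry     # work on an isolated branch
git add .
git commit -m "Retry failed payments up to three times"
git push origin feature/payment-retry     # share the branch so others can review and merge it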

10. GitHub


GitHub is a Git repository hosting web. It’s a type of dropbox for software projects where you can find code. When uploading your project, you have the choice of making it public or private. It’s a great place to network and meet like-minded people, share projects, discover others, etc. The community is huge and the project base even bigger.

11. Stack Overflow

Stack Overflow is the bible of any programmer. The fact that it is not at the top of this list of software development tools doesn’t mean it isn’t one of the most important! It’s a question-and-answer site with the largest community of programmers. In this library, you will find answers to all sorts of questions, ranging from how to change the color of a piece of text to how to modify the Linux kernel. It’s a great place to learn and share knowledge.






(8) Explain the build life cycle, using an example (java, .net, etc…)  


Build Lifecycle Basics


Maven is based around the central concept of a build lifecycle. What this means is that the process for building and distributing a particular artifact (project) is clearly defined.
For the person building a project, this means that it is only necessary to learn a small set of commands to build any Maven project, and the POM will ensure they get the results they desired.
There are three built-in build lifecycles: default, clean and site. The default lifecycle handles your project deployment, the clean lifecycle handles project cleaning, while the site lifecycle handles the creation of your project's site documentation.


A Build Lifecycle is Made Up of Phases


Each of these build lifecycles is defined by a different list of build phases, wherein a build phase represents a stage in the lifecycle.
For example, the default lifecycle comprises the following phases (for a complete list of the lifecycle phases, refer to the Lifecycle Reference):

validate - validate the project is correct and all necessary information is available

compile - compile the source code of the project

test - test the compiled source code using a suitable unit testing framework. These tests should not require the code be packaged or deployed

package - take the compiled code and package it in its distributable format, such as a JAR.

verify - run any checks on results of integration tests to ensure quality criteria are met

install - install the package into the local repository, for use as a dependency in other projects locally

deploy - done in the build environment, copies the final package to the remote repository for sharing with other developers and projects.
These lifecycle phases (plus the other lifecycle phases not shown here) are executed sequentially to complete the default lifecycle. Given the lifecycle phases above, this means that when the default lifecycle is used, Maven will first validate the project, then will try to compile the sources, run those against the tests, package the binaries (e.g. jar), run integration tests against that package, verify the integration tests, install the verified package to the local repository, then deploy the installed package to a remote repository.


Usual Command Line Calls


In a development environment, use the following call to build and install artifacts into the local repository.
mvn install
This command executes each default lifecycle phase in order (validate, compile, package, etc.), before executing install. You only need to call the last build phase to be executed, in this case, install.
In a build environment, use the following call to cleanly build and deploy artifacts into the shared repository.
mvn clean deploy
The same command can be used in a multi-module scenario (i.e. a project with one or more subprojects). Maven traverses into every subproject and executes clean, then executes deploy (including all of the prior build phase steps).

A Build Phase is Made Up of Plugin Goals


However, even though a build phase is responsible for a specific step in the build lifecycle, the manner in which it carries out those responsibilities may vary. And this is done by declaring the plugin goals bound to those build phases.
A plugin goal represents a specific task (finer than a build phase) which contributes to the building and managing of a project. It may be bound to zero or more build phases. A goal not bound to any build phase could be executed outside of the build lifecycle by direct invocation. The order of execution depends on the order in which the goal(s) and the build phase(s) are invoked. For example, consider the command below. The clean and package arguments are build phases, while the dependency:copy-dependencies is a goal (of a plugin).
mvn clean dependency:copy-dependencies package
If this were to be executed, the clean phase will be executed first (meaning it will run all preceding phases of the clean lifecycle, plus the clean phase itself), and then the dependency:copy-dependencies goal, before finally executing the package phase (and all its preceding build phases of the default lifecycle).
Moreover, if a goal is bound to one or more build phases, that goal will be called in all those phases.
Furthermore, a build phase can also have zero or more goals bound to it. If a build phase has no goals bound to it, that build phase will not execute. But if it has one or more goals bound to it, it will execute all those goals.
(Note: In Maven 2.0.5 and above, multiple goals bound to a phase are executed in the same order as they are declared in the POM, however multiple instances of the same plugin are not supported. Multiple instances of the same plugin are grouped to execute together and ordered in Maven 2.0.11 and above).
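As a sketch of how such a binding looks in the POM, the snippet below binds the run goal of the maven-antrun-plugin to the package phase, so it executes every time the build reaches that phase. The echoed message is just an example, and the exact configuration elements can vary between plugin versions.

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-antrun-plugin</artifactId>
      <executions>
        <execution>
          <id>announce-package</id>
          <phase>package</phase>          <!-- the phase this goal is bound to -->
          <goals>
            <goal>run</goal>              <!-- the plugin goal being bound -->
          </goals>
          <configuration>
            <target>
              <echo message="package phase reached"/>
            </target>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>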


Some Phases Are Not Usually Called From the Command Line



The phases named with hyphenated words (pre-*, post-*, or process-*) are not usually directly called from the command line. These phases sequence the build, producing intermediate results that are not useful outside the build. In the case of invoking integration-test, the environment may be left in a hanging state.
Code coverage tools such as Jacoco and execution container plugins such as Tomcat, Cargo, and Docker bind goals to the pre-integration-test phase to prepare the integration test container environment. These plugins also bind goals to the post-integration-test phase to collect coverage statistics or decommission the integration test container.
Failsafe and code coverage plugins bind goals to integration-test and verify phases. The net result is test and coverage reports are available after the verify phase. If integration-test were to be called from the command line, no reports are generated. Worse is that the integration test container environment is left in a hanging state; the Tomcat webserver or Docker instance is left running, and Maven may not even terminate by itself.
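A minimal sketch of the Failsafe setup described above (version omitted for brevity): its integration-test and verify goals are bound to the phases of the same names, which is why running mvn verify is the safe way to get integration tests and their reports.

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <goal>integration-test</goal>   <!-- runs during the integration-test phase -->
        <goal>verify</goal>             <!-- checks the results during the verify phase -->
      </goals>
    </execution>
  </executions>
</plugin>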




(9) What is Maven, a dependency/package management tool or a build tool or something more? 


Introduction

Maven, a Yiddish word meaning accumulator of knowledge, was originally started as an attempt to simplify the build processes in the Jakarta Turbine project. There were several projects each with their own Ant build files that were all slightly different and JARs were checked into CVS. We wanted a standard way to build the projects, a clear definition of what the project consisted of, an easy way to publish project information and a way to share JARs across several projects.
The result is a tool that can now be used for building and managing any Java-based project. We hope that we have created something that will make the day-to-day work of Java developers easier and generally help with the comprehension of any Java-based project.

Maven’s Objectives

Maven’s primary goal is to allow a developer to comprehend the complete state of a development effort in the shortest period of time. In order to attain this goal there are several areas of concern that Maven attempts to deal with:
  • Making the build process easy
  • Providing a uniform build system
  • Providing quality project information
  • Providing guidelines for best practices development
  • Allowing transparent migration to new features

Making the build process easy

While using Maven doesn’t eliminate the need to know about the underlying mechanisms, Maven does provide a lot of shielding from the details.

Providing a uniform build system

Maven allows a project to build using its project object model (POM) and a set of plugins that are shared by all projects using Maven, providing a uniform build system. Once you familiarize yourself with how one Maven project builds, you automatically know how all Maven projects build, saving you immense amounts of time when trying to navigate many projects.
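The POM is what makes this uniformity possible. A minimal sketch of one (the coordinates com.example:demo-app are hypothetical) is already enough for Maven to build, test and package the project using its standard conventions:

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>        <!-- hypothetical coordinates -->
  <artifactId>demo-app</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>jar</packaging>
</project>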

Providing quality project information

Maven provides plenty of useful project information that is in part taken from your POM and in part generated from your project’s sources. For example, Maven can provide:
  • Change log document created directly from source control
  • Cross referenced sources
  • List of mailing lists managed by the project
  • Dependency list
  • Unit test reports including coverage
As Maven improves, the information set provided will improve, all of which will be transparent to users of Maven.
Other products can also provide Maven plugins to allow their set of project information alongside some of the standard information given by Maven, all still based on the POM.

Providing guidelines for best practices development

Maven aims to gather current principles for best practices development, and make it easy to guide a project in that direction.
For example, specification, execution, and reporting of unit tests are part of the normal build cycle using Maven. Current unit testing best practices were used as guidelines:
  • Keeping your test source code in a separate, but parallel source tree
  • Using test case naming conventions to locate and execute tests
  • Having test cases set up their environment instead of relying on customizing the build for test preparation.
Maven also aims to assist in project workflow such as release and issue management.
Maven also suggests some guidelines on how to layout your project’s directory structure so that once you learn the layout you can easily navigate any other project that uses Maven and the same defaults.
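The default layout Maven expects looks like this; following it means no extra configuration is needed to tell Maven where things are:

pom.xml              the project descriptor
src/main/java        application sources
src/main/resources   resources packaged with the artifact
src/test/java        unit test sources (the separate but parallel test tree)
src/test/resources   test-only resources
target/              generated build output (never checked into version control)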

Allowing transparent migration to new features

Maven provides an easy way for Maven clients to update their installations so that they can take advantage of any changes that have been made to Maven itself.
Installation of new or updated plugins from third parties or Maven itself has been made trivial for this reason.

What is Maven Not?

You may have heard some of the following things about Maven:

  • Maven is a site and documentation tool
  • Maven extends Ant to let you download dependencies
  • Maven is a set of reusable Ant scriptlets
While Maven does these things, as you can read above in the “What is Maven?” section, these are not the only features Maven has, and its objectives are quite different.
Maven does encourage best practices, but we realise that some projects may not fit with these ideals for historical reasons. While Maven is designed to be flexible, to an extent, in these situations and to the needs of different projects, it can not cater to every situation without making compromises to the integrity of its objectives.
If you decide to use Maven, and have an unusual build structure that you cannot reorganise, you may have to forgo some features or the use of Maven altogether.





(10) Discuss how Maven uses conventions over configurations, explaining Maven’s approach to manage the configurations 



Maven configuration occurs at 3 levels:

  • Project - most static configuration occurs in pom.xml
  • Installation - this is configuration added once for a Maven installation
  • User - this is configuration specific to a particular user
The separation is quite clear - the project defines information that applies to the project, no matter who is building it, while the others both define settings for the current environment.
The installation and user configuration cannot be used to add shared project information - for example, setting <organization> or <distributionManagement> company-wide.
For this, you should have your projects inherit from a company-wide parent pom.xml.
You can specify your user configuration in ${user.home}/.m2/settings.xml. A full reference to the configuration file is available. This section will show how to make some common configurations. Note that the file is not required - defaults will be used if it is not found.
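Two short sketches of what this looks like in practice (all coordinates, paths and URLs here are hypothetical). A project inherits the shared, company-wide information from a parent POM:

<parent>
  <groupId>com.example</groupId>
  <artifactId>company-parent</artifactId>   <!-- defines organization, distributionManagement, etc. -->
  <version>1.2.0</version>
</parent>

while user-level configuration lives in ${user.home}/.m2/settings.xml and applies to every build that user runs, regardless of project:

<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
  <mirrors>
    <mirror>
      <id>company-mirror</id>
      <mirrorOf>central</mirrorOf>
      <url>https://repo.example.com/maven2</url>
    </mirror>
  </mirrors>
</settings>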













(11) Discuss the terms build phases, build life cycle, build profile, and build goal in Maven  




Build Phases


A Build Lifecycle is Made Up of Phases


Each of these build lifecycles is defined by a different list of build phases, wherein a build phase represents a stage in the lifecycle.
For example, the default lifecycle comprises the following phases (for a complete list of the lifecycle phases, refer to the Lifecycle Reference):
  • validate - validate the project is correct and all necessary information is available
  • compile - compile the source code of the project
  • test - test the compiled source code using a suitable unit testing framework. These tests should not require the code be packaged or deployed
  • package - take the compiled code and package it in its distributable format, such as a JAR.
  • verify - run any checks on results of integration tests to ensure quality criteria are met
  • install - install the package into the local repository, for use as a dependency in other projects locally
  • deploy - done in the build environment, copies the final package to the remote repository for sharing with other developers and projects.
These lifecycle phases (plus the other lifecycle phases not shown here) are executed sequentially to complete the default lifecycle. Given the lifecycle phases above, this means that when the default lifecycle is used, Maven will first validate the project, then will try to compile the sources, run those against the tests, package the binaries (e.g. jar), run integration tests against that package, verify the integration tests, install the verified package to the local repository, then deploy the installed package to a remote repository.

Build Lifecycle


Maven is based around the central concept of a build lifecycle. What this means is that the process for building and distributing a particular artifact (project) is clearly defined.
For the person building a project, this means that it is only necessary to learn a small set of commands to build any Maven project, and the POM will ensure they get the results they desired.
There are three built-in build lifecycles: default, clean and site. The default lifecycle handles your project deployment, the clean lifecycle handles project cleaning, while the site lifecycle handles the creation of your project's site documentation.


Build Profile


Apache Maven 2.0 goes to great lengths to ensure that builds are portable. Among other things, this means allowing build configuration inside the POM, avoiding all filesystem references (in inheritance, dependencies, and other places), and leaning much more heavily on the local repository to store the metadata needed to make this possible.
However, sometimes portability is not entirely possible. Under certain conditions, plugins may need to be configured with local filesystem paths. Under other circumstances, a slightly different dependency set will be required, and the project's artifact name may need to be adjusted slightly. And at still other times, you may even need to include a whole plugin in the build lifecycle depending on the detected build environment.
To address these circumstances, Maven 2.0 introduces the concept of a build profile. Profiles are specified using a subset of the elements available in the POM itself (plus one extra section), and are triggered in any of a variety of ways. They modify the POM at build time, and are meant to be used in complementary sets to give equivalent-but-different parameters for a set of target environments (providing, for example, the path of the appserver root in the development, testing, and production environments). As such, profiles can easily lead to differing build results from different members of your team. However, used properly, profiles can be used while still preserving project portability. Profiles also minimize the use of Maven's -f option, which lets a user build with a different POM containing other parameters or configuration; keeping everything in one POM with profiles is more maintainable.
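A minimal sketch of such profiles (the ids and appserver paths are made up): each profile supplies an equivalent-but-different value, and the profile to use is selected at build time with the -P flag, e.g. mvn package -P prod.

<profiles>
  <profile>
    <id>dev</id>
    <properties>
      <appserver.home>/opt/appserver-dev</appserver.home>
    </properties>
  </profile>
  <profile>
    <id>prod</id>
    <properties>
      <appserver.home>/opt/appserver-prod</appserver.home>
    </properties>
  </profile>
</profiles>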



Build Goal


However, even though a build phase is responsible for a specific step in the build lifecycle, the manner in which it carries out those responsibilities may vary. And this is done by declaring the plugin goals bound to those build phases.
A plugin goal represents a specific task (finer than a build phase) which contributes to the building and managing of a project. It may be bound to zero or more build phases. A goal not bound to any build phase could be executed outside of the build lifecycle by direct invocation. The order of execution depends on the order in which the goal(s) and the build phase(s) are invoked. For example, consider the command below. The clean and package arguments are build phases, while dependency:copy-dependencies is a goal (of a plugin).
mvn clean dependency:copy-dependencies package

Moreover, if a goal is bound to one or more build phases, that goal will be called in all those phases.
Furthermore, a build phase can also have zero or more goals bound to it. If a build phase has no goals bound to it, that build phase will not execute. But if it has one or more goals bound to it, it will execute all those goals.
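For example, a goal can be invoked on its own, outside any lifecycle, or mixed with phases on the same command line, as already shown in section (8):

mvn dependency:copy-dependencies                  # goal invoked directly, no lifecycle phases run
mvn clean dependency:copy-dependencies package    # phases and a goal mixed in one invocation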






























