Wednesday, December 14, 2011

"Exercising" the three prime directives / Hale Aloha command line interface part 2


As the final installment of the ICS 314 software engineering course, we were asked by our professor to further analyze and explore the tools and fundamentals of software development and management through issue-driven project management.  Issue-driven project management is, once again, a combination of several software development tools that together provide a dynamic process for developing software.  The project is contained in a repository on Google Project Hosting, which gives developers who want to contribute to the project access to the source and build files.
So, in order to gain a more "complete" understanding of the three prime directives of software development, our professor graciously bestowed upon us the task of building upon code that was not our own.  In the last entry of this blog we reviewed the code of another group, which in retrospect served as the primer for this assignment and gave us some time to become familiar with the foreign source code.  In all honesty, this was quite a difficult and time-consuming task.  For me personally, it takes a while just to adjust my thinking enough to understand exactly what the author of the code is trying to accomplish.  Writing code can really be seen as an art form; it takes a lot of concentration to logically lay out what needs to be built and to organize it in a way that makes sense.  If it is written sloppily or nonsensically, it becomes quite difficult for another person to look at it and truly understand what the original author is trying to convey.
Luckily, the code that we received from our group's counterpart was, for the most part, written quite well.  There was a bit of added complexity because they had separated the control/main program into three hierarchical levels, which made it a little more difficult to understand and to figure out where to implement the new commands.  At the command level they used an interface, which works fine for the original commands because they all had the same input format; trying to implement new commands with different input formats gets a little trickier.  However, we did manage to find a workaround to get the new commands to execute without disrupting the overall functioning of the program.
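Roughly, the kind of structure involved might look like the following sketch, where each command takes its arguments as an array so that commands with different input formats can share one dispatch path.  The names and details here are hypothetical, not our group's actual code:

    import java.util.HashMap;
    import java.util.Map;

    // Each command implements the same interface and receives its arguments as an array,
    // so commands with different input formats can share one dispatch path.
    interface Command {
      void execute(String[] args) throws Exception;
    }

    class CommandProcessor {
      private final Map<String, Command> commands = new HashMap<String, Command>();

      void register(String name, Command command) {
        commands.put(name, command);
      }

      void process(String inputLine) {
        String[] tokens = inputLine.trim().split("\\s+");
        Command command = commands.get(tokens[0]);
        if (command == null) {
          System.out.println("Invalid command. Type \"help\" for a list of commands.");
          return;
        }
        String[] args = new String[tokens.length - 1];
        System.arraycopy(tokens, 1, args, 0, args.length);
        try {
          command.execute(args);  // each command validates its own argument format
        }
        catch (Exception e) {
          System.out.println("Error: " + e.getMessage());
        }
      }

      public static void main(String[] args) {
        CommandProcessor processor = new CommandProcessor();
        processor.register("help", new Command() {
          public void execute(String[] commandArgs) {
            System.out.println("Each command's usage would be listed here.");
          }
        });
        processor.process("help");
        processor.process("not-a-command");
      }
    }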
Working in a group on something like this, where the code is not your own, adds another layer of difficulty.  Not only are you dealing with unfamiliar code, but you also depend on other parts being completed by other people before anything can be tested or run.  I found this aspect of the project the most troublesome, because there is only so much you can do before you start writing extra code just to be able to test the classes.  The whole cycle of compiling, running, and testing can also become quite frustrating and repetitive, because everything must be freshly compiled to pick up any changes that have been made.
In conclusion, this exercise was quite an exhaustive analysis of the three prime directives, especially the third directive, which pertains to an external developer enhancing the system.  It proved to be a much more difficult task because of the design of the program, and it goes to show that it isn't easy to create modular code, or a product that others can easily interpret and understand.  The ideal vision of the project management tool is to have developers view the code in development and enhance the overall state of the project, but like anything of considerable complexity that requires dedication and time to become proficient in, there is a learning curve; what it boils down to is a person's will to stay interested and learn the system in order to contribute.

Friday, December 2, 2011

Halealoha-cli-jcev Technical Review

Recently in our software engineering class, we were introduced to a development tool that, for an open source software developer, is a godsend.  The concept is fairly simple: a project that a developer wants to work on is open to the public, so anyone can contribute and continuously add content to improve whatever application is being developed.  These project management techniques, which go under the umbrella names of "continuous integration" and "issue-driven project management," are a combination of several tools that together form a very powerful development engine, providing a workbench for developers to build upon a particular project.  So, with our newly acquired knowledge of the WattDepot API, we were asked to form groups with two other classmates to weather the climate of a group-developed project and build a command line interface for WattDepot using these project management tools.
As part of the development of our project, we had to create and continuously build upon an associated project page that would provide information for other developers to learn about the project.  Issue-driven development keeps the project page updated with who is doing what and what has been completed.  "Issues" are tasks or parts of the application that need to be coded and are assigned to members/contributors of the project.  Continuous integration is the monitoring part of the development: instead of having the developer constantly download and compile the sources, the CI server does this automatically and reports in real time when there is a compile error.  As an all-encompassing lesson for this portion of the semester, after completing our project with our group members we were asked to do a technical review of another group's project and their use of the project management tools mentioned previously.  Below is my technical review of that group.
Review Question 1:
As far as the functionality of the program goes, all of the commands appear to work properly.  The messages/prompts that appear when an error occurs are quite user friendly: they are clear and concise, and offer a simple explanation of why the error may have occurred.  However, when the program is first run, it would be helpful to offer a brief explanation to the user of what exactly the program does, along with some initial instructions on how to enter commands and what to expect when they are entered.  The only instructional prompt appears when the "help" command is entered, which may be confusing to a less "savvy" user running the program for the first time.  Overall, the first prime directive has been satisfied in this section of the technical review (the system successfully accomplishes a useful task).
Review Question 2:
The associated project site for this command line interface program is well developed and has been populated with code examples and images of what the interface actually looks like.  What is good about this site is that the navigation has clearly been thought out with the user's experience in mind.  The links that take the user from one page to the next provide the appropriate resources for the level of involvement they are looking for, whether they choose to simply run the application or to contribute and build upon it.  There is a clear set of step-by-step installation instructions, and the site even provides download links to the appropriate software versions if needed.  If there is one thing that frustrates a user or developer most, it is reading that a certain piece of software must be installed prior to running the target program and then having to locate that particular package somewhere on the web.  This site provides easy access to the software, and is fully self-contained.
Another user/developer friendly feature is that the download section provides both the source and the executable jar as separate files.  This makes the experience for a user who simply wants to run the application much more enjoyable: to run the program, all they have to do is download the jar by itself; they don't have to download all of the source files and look for the executable in that folder.
It appears that all of the commands entered into the system produce adequate messages and results.  When the input is incorrect or the format is not quite right, the program is able to distinguish what type of error occurred and print the appropriate message to the user.  For example, when a completely invalid command such as "test" is entered, the interface responds with the error message "Error: "test" is an invalid command. Please try help for a list of valid commands".  This tells the user to seek out the valid commands in the help document.  When the user enters a command that is close to a valid command but has the wrong format, the system responds with "Source does not exists".  This alerts the user that they are getting closer to producing a correct command, but may be entering something incorrectly.  The only issue I had while testing the system was that the help text could have been written with a little more clarity and detail, explaining the commands and how to enter them correctly.  All in all, the bases have been covered and potential loopholes in the program have been sealed.  The interface satisfies the second prime directive (an external user can successfully install and use the system).

Review Question 3
The developer guide is very well written and documents each step of the process for developers who want to contribute to this project.  The steps for installing the program and verifying the build are documented in a step-by-step process, and the guide makes sure that a user who intends to contribute understands the rules and guidelines they must follow in order to make a productive contribution.  The developer guide lists The Elements of Java Style as its source of coding standards and even provides a link to it so contributors can understand and follow those same standards.  The project is also listed as being under continuous integration, with a link to the Jenkins server so that the developer can monitor the status of new builds, which is important for making sure that their contributions do not break the build.  Finally, there is a brief explanation of how to generate javadocs, and the guide also directs the developer to the pre-generated javadocs contained within the source folder.  A nice added touch at the end of the guide is a support contact for any developer who runs into issues while trying to add their contribution.
The sources from svn checkout work without any problem.  The build and verify targets built and tested the system without errors and generated the javadoc documentation correctly.  The javadoc documentation is very well written and explains what each function does in a clear and understandable manner.  The architecture of the system supports information hiding, meaning that the structure is kept intact if changes are made, say when a new type of command is implemented.  Each command is its own entity/class that implements a shared "command" class, preserving the overall structure and making each new "command class" dependent on this "parent" class.
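As a rough illustration of that kind of structure (the names here are hypothetical, not the reviewed group's actual classes), adding a new command would only mean writing one new self-contained class against the shared parent type, without touching the existing ones:

    // Shared "parent" command type; existing classes never need to change.
    interface Command {
      void execute(String[] args);
    }

    // A new command is added as its own self-contained class.
    class CurrentPowerCommand implements Command {
      public void execute(String[] args) {
        // Argument validation and the WattDepot query for the named source would go here.
        System.out.println("current-power called with " + args.length + " argument(s)");
      }
    }

    class ExtensionDemo {
      public static void main(String[] args) {
        new CurrentPowerCommand().execute(new String[] { "Lehua" });
      }
    }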
The source files were downloaded from the developer guide page and compiled with no issues.  A look at the actual Java code of each file confirmed the use of the Java style guidelines.  The code is not too difficult to follow; there is a little ambiguity about what gets passed where, but that may be a personal perspective from having worked on the same program.  The quality of the javadocs generated earlier is evidence of the sufficient amount of commenting on each method/function.  Looking at the issues/updates page, the workload appears to be spread pretty evenly among the members.  Each member took on an equivalent amount of work, and it looks like each of them took charge at some point, updating things and fixing problems that arose.  From inspection of the continuous integration server, this group had only one failed build during the whole duration of the project, which is quite a feat considering how many builds were executed.  Each build had an appropriate issue associated with it, which shows the attention to detail that these members had for their project.  In conclusion, this group has satisfied the third prime directive (an external developer can successfully understand and enhance the system).
    



Wednesday, November 30, 2011

Issue driven project management

In our software engineering course, we have been given the task of constructing a command line interface as a group effort, with the use of a unique tool to help us along with the project management.  This tool, an open source hosting application provided by Google, allows us to upload our code to a repository and post the updates that each group member chooses to be responsible for.  What is good about this type of "issue" driven project management system is that it continuously logs contributions to the project as they are made, and thus provides real-time feedback to the contributing members about what has been completed, what is currently being worked on, and what still needs to be started.  The hosting interface is quite intuitive and can be a powerful tool if used to its full potential; however, the tool is only as powerful as the person who uses it.  I think the all-encompassing lesson to learn from this assignment was how to work well with others: working in groups, especially on a project such as this where individuals are responsible for certain pieces, requires a certain amount of trust that your group will get their parts done on time.

I think that this project management tool helped expose both our strengths and weaknesses as a group, and provides a source of feedback about what seems to work and what could improve the efficiency of the group.  For our group, we may not have used the functionality of the project hosting to its full capability, and at times it was hard to determine whether any progress was being made on the issues due to the lack of updates.  However, when there were updates and changes, it was clear who had done what and what actions were performed for that update.  Another advantage of using a project hosting tool like this is that it minimizes the need to physically meet with the other group members.  Everything can be done remotely through the repository, which in some ways makes the whole process of working with a group more efficient.

Monday, November 7, 2011

Wattdepot Katas

Recently in the software engineering class, we were introduced to a custom-built piece of software called WattDepot.  This software was designed and built for a project that allows the university to monitor the power and energy consumption of the school dormitories.  As a way to familiarize ourselves with it, we were instructed this week to develop a few short programs that perform basic data retrieval from the WattDepot server.

Initially, the amount of information was a bit overwhelming, and it took a while to decipher how to use certain components correctly.  Some of the technical information provided by the javadocs was a bit sparse and at times not very helpful.  That being said, after spending enough time perusing every available document and fiddling around with the functions, things became familiar and easier once I got the hang of it.

In total, I probably spent around 8 to 10 hours working on these programs.  There was a good hour or so just reading through the documentation and articles to get a grasp on what this project was and how to implement code that would communicate with the WattDepot server.  I also had to figure out a way to test my programs from the command line so that they would talk to the actual WattDepot server, and found an easy way to do this with a simple change to the jar.build.xml file that came with the WattDepot download.
SourceListing:
The first "kata," called SourceListing, was quite simple and took about 5 minutes to write and test.  Using the simple example application that came with the source files as a basis, it was just a matter of changing a few things around.
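For reference, here is a minimal sketch of the idea, with the caveat that the package names, constructor, and getSources()/getName() calls are written from memory of the 2011-era WattDepot client API and may not match exactly; the server URI is a placeholder taken from the command line, which is also how I ended up testing these programs:

    import java.util.List;
    import org.wattdepot.client.WattDepotClient;
    import org.wattdepot.resource.source.jaxb.Source;

    public class SourceListing {
      public static void main(String[] args) throws Exception {
        // The server URI is passed on the command line; the default below is a placeholder.
        String serverUri = (args.length > 0) ? args[0] : "http://server.example.org:8190/wattdepot/";
        WattDepotClient client = new WattDepotClient(serverUri);
        // Retrieve all public sources and print their names.
        List<Source> sources = client.getSources();
        for (Source source : sources) {
          System.out.println(source.getName());
        }
      }
    }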
SourceHierarchy:
This kata was a bit of a hassle due to some confusion induced by the methods of the classes.  At this point I was still figuring out exactly how to use the methods correctly, so I was unsure what to call on the class objects I had created.  At first I used the owner from the Source class, thinking that it would indicate some kind of network for the source hierarchy, but found that was not correct when I printed out the results.  I am not certain that I used the correct methods to determine the source hierarchy, but I did get results that looked promising with the isSetOwner method, which shows some form of ordering through inspection of the source name.  In the end, I probably spent about an hour to an hour and a half on this implementation.
SourceLatency:
This kata was pretty simple because it was a bit of a rehash of the simple application code.  However, I spent a good chunk of time wondering how to calculate the latency.  I actually skipped this problem, went on to the others, and came back to it afterward.  After looking at the simple app code, it was quite easy to devise the code to calculate the latency.  Including the time I spent pondering how to do the problem, it took me about an hour or so to complete.
EnergyYesterday:
This kata was the one I probably spent the most time on.  It was the first problem that involved energy retrieval and multiple timestamps.  After tinkering with the methods for a while, and testing to see that I was actually using them correctly, the issue of how to sort and display the corresponding source names took some time to solve.  This was particularly time consuming because the software package has no built-in method for sorting or mapping.  So I actually implemented a hash map, which is probably not the most efficient approach given some lag when the program executes, but it seemed to do the trick.  With all the hurdles and lessons learned from building this program, I spent about 2 1/2 hours.
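A rough sketch of the hash map approach in plain Java follows; the energy values here are hard-coded stand-ins for the WattDepot query results, and the class and variable names are mine, not the kata's:

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.Comparator;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    public class EnergySorter {
      // Sorts source names by their energy values, highest first.
      public static List<Map.Entry<String, Double>> sortByEnergy(Map<String, Double> energyBySource) {
        List<Map.Entry<String, Double>> entries =
            new ArrayList<Map.Entry<String, Double>>(energyBySource.entrySet());
        Collections.sort(entries, new Comparator<Map.Entry<String, Double>>() {
          public int compare(Map.Entry<String, Double> a, Map.Entry<String, Double> b) {
            return b.getValue().compareTo(a.getValue());
          }
        });
        return entries;
      }

      public static void main(String[] args) {
        Map<String, Double> energyBySource = new HashMap<String, Double>();
        // In the kata these values came from the energy query for yesterday;
        // they are hard-coded here only to make the sketch runnable.
        energyBySource.put("Lehua", 123.4);
        energyBySource.put("Mokihana", 98.7);
        for (Map.Entry<String, Double> entry : sortByEnergy(energyBySource)) {
          System.out.println(entry.getKey() + ": " + entry.getValue() + " kWh");
        }
      }
    }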
MondayAverageEnergy:
After doing the EnergyYesterday program, this was a breeze because I had already done all the work for the previous program.  It was a simple matter of changing around the timestamps and averaging the values of the collection points.  There were a few issues while debugging the program, but most of the problems turned out to be syntax errors.  This took about an hour to complete.
HighestRecordedPowerYesterday:
Unfortunately, I did not get this program to work correctly.  I did get some results from the power retrieval method; however, they don't seem to make any sense.  I think the problem may be the time interval it is collecting at.  I assumed that the timestamp increment method was the only way to continuously change the hour, but when printing the results of the collection times, the data didn't seem to change.  I worked on this program for a while, then moved on and came back to it later.  I spent a total of 2 hours on this problem.
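To illustrate the hourly stepping I was aiming for, here is a rough sketch using plain java.util.Calendar, with the actual WattDepot power query replaced by a dummy placeholder method; none of this is the real kata code:

    import java.util.Calendar;

    public class HighestPowerSketch {
      public static void main(String[] args) {
        // Start at midnight yesterday.
        Calendar timestamp = Calendar.getInstance();
        timestamp.add(Calendar.DAY_OF_MONTH, -1);
        timestamp.set(Calendar.HOUR_OF_DAY, 0);
        timestamp.set(Calendar.MINUTE, 0);
        timestamp.set(Calendar.SECOND, 0);

        double highestPower = Double.NEGATIVE_INFINITY;
        Calendar highestTime = null;

        // Step through yesterday hour by hour, keeping the largest reading.
        for (int hour = 0; hour < 24; hour++) {
          double power = queryPowerAt(timestamp);  // placeholder for the WattDepot power query
          if (power > highestPower) {
            highestPower = power;
            highestTime = (Calendar) timestamp.clone();
          }
          timestamp.add(Calendar.HOUR_OF_DAY, 1);   // the step I suspect my version got wrong
        }
        System.out.println("Highest power: " + highestPower + " W at " + highestTime.getTime());
      }

      // Stand-in for the real power retrieval call; returns dummy data here.
      private static double queryPowerAt(Calendar timestamp) {
        return Math.random() * 1000;
      }
    }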
In conclusion, this was a very time-consuming task, because we were learning how to use custom-made software with an extensive number of parameters and collection processes.  All in all, it was a good learning experience and gives a sense of what it takes to develop software for a system.  There is obviously a whole lot more functionality built into the software, and for this assignment it seems as though we've barely scratched the surface of manipulating the data.

Thursday, November 3, 2011

Renewing our sights on Renewable Energy

In recent years, it has been quite astounding to witness such a collective shift of consciousness and awareness toward humanity's effect on the natural environment.  In 2007, it seemed as though the whole world had been swept up in the newly coined term "going green," implying that people were now taking the initiative to be more environmentally conscious and responsible for their energy usage and lifestyles.  Although this has been a definitive step in the right direction, there is obviously a much larger need for action and attention on these issues from a societal perspective, and we should be allocating far more financial resources to the development of sustainable and renewable technology.  There is no doubt that as our societies continue to consume at their current rate, our global situation will grow ever more dire.

For the state of Hawaii, the solution for both economic and ecological stability seems to be staring us in the face.  Given our geographical location, we are a perfectly suited host for green technology development and implementation, and we should be at the forefront of this movement, setting the bar for the rest of the country.  Being completely isolated on an independent energy grid should be all the more reason to sever our dependence on imported oil.  Not only is our dependence on oil negatively affecting our environment, it negatively affects our economy as well by sapping the infrastructure of money and jobs.  By locally developing technology to utilize renewable energy sources, we would not only ensure the protection of the environment, but also stimulate our local economy by retaining money locally and creating meaningful technology jobs.

It is refreshing to see that, however slowly, renewable technology development within Hawaii is starting to come into focus.  Organizations within the University of Hawaii such as HNEI (the Hawaii Natural Energy Institute) are working with energy companies to develop metering devices that equip consumers with feedback tools, so that consumers can make more conscious and responsible consumption choices.  Metering devices like these could revolutionize the way we view and think about energy consumption in our households, and possibly spur a much broader awareness.

As our American culture goes, however, capitalization on consumerism by large corporations is what primarily drives, or should I say caps, our progress toward what we are actually capable of achieving as a country.  This is not a completely isolated pessimistic view; in line with current events, there seem to be a lot of people who feel polarized over the idea of large corporations holding the balance of the nation's economy.  Ultimately, there needs to be a corporate mind shift to focus on what is absolutely necessary, and on what we should really be contributing our efforts and energy toward.
          

Tuesday, October 25, 2011

5 questions that may help to study for the midterm (but maybe not)

Q1:  What equivalence relations hold for an “object.equals” method?
Reflexivity, Symmetry, Transitivity
Q2:  What are the two primary purposes of Code Review?
  • To ensure the quality of the code that is being released
  • To learn when and how to apply techniques that will improve the code's quality, consistency, and maintainability.
Q3:  Write a simple Robocode program that would make your robot move forward 50 pixels, spin around in a circle and then stop.
import robocode.Robot;

public class SimpleRobot extends Robot {
  public void run() {
    ahead(50);       // move forward 50 pixels
    turnRight(360);  // spin around in a full circle
    stop();          // then stop
  }
}
Q4:  Name four available checks that checkstyle can implement, and what they do.
  1. EmptyBlock - checks for empty blocks of code
  2. IllegalType - checks that particular classes are never used
  3. Indentation - checks for correct indentation of code
  4. LineLength - checks for overly long lines
Q5: Name 3 suggestions/reasons from the reading for why you should use Git rather than SVN
  • Git is distributed and gives every developer a complete copy of the project's history, and in this way can be more efficient than SVN
  • It is faster and easier to create branches
  • You can commit changes in stages

Tuesday, October 18, 2011

SVN and Google Projects Hosting


As we continue to spelunk into the vast realm of software engineering and explore its extensive catalog of developer tools, we came across a real gem this past week, something invaluable to any application developer.  The tool is called SVN, a configuration management program that allows other developers to download your source code and make contributions to it.  In conjunction with this tool is a Google-powered hosting site called Google Project Hosting, which allows developers to "broadcast" their code to a targeted community.  With Google Project Hosting one can even attract and build a community centered on a custom project that one has posted.
Though this is an extremely powerful and useful tool for developers, it is strictly that: a tool for developers.  Initially, it was not very clear to me what it was for or what we were trying to accomplish, but soon after it became blatantly clear.  The initial experience of getting everything synced up was a bit convoluted (which is why it is an application for developers), but after the setup it was quite simple to commit changes to the project.  It was only a matter of minutes after the setup that I was able to upload and see the changes I'd made to the system on the hosting site.  One of the more interesting aspects, which really amazed me, was the speed at which people could post their updates.  Whenever someone changed something in a file, the update appeared on the hosting site instantaneously, and after re-syncing the SVN client with your system the new additions would appear locally.  The ability for developers to get instantaneous feedback and contributions from other developers is, I think, what has allowed technology to progress so quickly these days, and exposure to this tool has been somewhat of an epiphany for me about the potential that it holds.
The other half of this assignment was to create our own project hosting site.  After navigating the process of locating all the necessary pieces for an upload on the other hosting site, this was quite simple to do on my own.  The only thing that was a bit of a hassle was uploading the files to the correct directory.  I assume there is a more efficient way of doing this, but to get it done I had to individually upload each file from my system into the trunk directory of the hosting site.  I guess there is probably some way to configure SVN to do this, but from an administrator's standpoint this would be much easier if Google hosting had a mass file upload feature.
 Hosting Site:
http://code.google.com/p/robocode-eje-bumblebee/