The cue co-ordinates were mapped out as 3s on the 2-D map; this gives the location of the cue tip.
The cue angle is also an output of the Hough transform, but the angle given needed to be altered to match the physics engine:
Actual Cue Angle = Cue Angle + Theta - 90
Theta is the angle of the image rotation; the -90 is needed to match the Hough angle to the physics engine.
The ball closest to the cue tip needs to be found in order to apply the cue angle to it. This is done by calculating the distance from each ball to the cue tip; the shortest distance gives the cue ball. For testing purposes the cue will have a red tip, as this makes it much easier to detect.
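The distance check described above can be sketched as follows; the co-ordinate arrays and method name are illustrative, not the project's actual code.

```java
// Sketch of finding the cue ball: the ball whose centre is nearest the
// detected cue tip. Ball and tip co-ordinates are assumed to have already
// been read off the 2-D map (balls marked as 2s, cue marked as 3s).
public class CueBallFinder {

    // Returns the index of the ball closest to the cue tip
    static int nearestBall(int[] ballX, int[] ballY, int tipX, int tipY) {
        int best = -1;
        double bestDist = Double.MAX_VALUE;
        for (int n = 0; n < ballX.length; n++) {
            double dx = ballX[n] - tipX;
            double dy = ballY[n] - tipY;
            double dist = Math.sqrt(dx * dx + dy * dy); // straight-line distance
            if (dist < bestDist) {
                bestDist = dist;
                best = n;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        int[] xs = {100, 400, 700};
        int[] ys = {200, 200, 350};
        // Cue tip at (390, 210): ball 1 is nearest, so it is the cue ball
        System.out.println(nearestBall(xs, ys, 390, 210));
    }
}
```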
Testing of the application as a whole project was to be done by building a 'database' of 20 images which work in the correct lighting conditions. The image analysis is very sensitive to the lighting conditions in the room, and the weather outside can have a big effect on the noise found in the image.
Chris
Saturday, 21 April 2012
Thursday, 19 April 2012
Cue detection
The cue will be detected as a straight line in the image; this means that the Hough transform can be used to calculate the angle of the cue. However, how can the full 360-degree angle, or the end of the cue, be detected when the Hough lines go through the whole image?
The cue has already been filtered using the filtering limits given in the post Revised Image Analysis (part 2). So by finding the pixels where the Hough line and the filtered cue region cross, the pixels the cue occupies are found. The equations to calculate these pixels are given below.
j is the increment through the Hough matrix and is always less than rhoHeight; rhoHeight is the size of the Hough accumulator in the rho direction.
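A hedged sketch of that intersection step, assuming the standard rho = x*cos(theta) + y*sin(theta) parameterisation of the Hough line; the mask layout and method names are assumptions, not the project's code.

```java
// Walk along the Hough line (rho, theta) and keep only pixels that also
// passed the cue colour filter -- those are the pixels the cue is in.
public class CueLinePixels {

    // mask[y][x] is true where the cue filter fired; returns how many pixels
    // lie both on the Hough line and inside the filtered cue region.
    // (A near-vertical line, sin(theta) ~ 0, would need the x = (rho - y*sin)/cos form.)
    static int cuePixelCount(boolean[][] mask, double rho, double theta) {
        int height = mask.length, width = mask[0].length;
        double sin = Math.sin(theta), cos = Math.cos(theta);
        int count = 0;
        for (int x = 0; x < width; x++) {
            // Line equation: rho = x*cos(theta) + y*sin(theta)
            int y = (int) Math.round((rho - x * cos) / sin);
            if (y >= 0 && y < height && mask[y][x]) {
                count++;
            }
        }
        return count;
    }

    public static void main(String[] args) {
        boolean[][] mask = new boolean[5][5];
        for (int x = 0; x < 5; x++) {
            mask[2][x] = true; // pretend the cue filter fired along row y = 2
        }
        // A horizontal Hough line (theta = 90 degrees, rho = 2) crosses that row
        System.out.println(cuePixelCount(mask, 2.0, Math.PI / 2));
    }
}
```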
Tuesday, 17 April 2012
Changes to Code
One Week Until Bench Inspection....AAARRRGGHH!!!
There were problems found in the scaling function: sometimes the ball co-ordinates are lost during the shrinking. This is because it is going from a 1024x768 image to an 800x400 model of the table.
In order to overcome this, pixels which contained a 1, 2 or 3 could not be overwritten, meaning no data was lost.
The rotation matrix was also altered so that the theta found was negated (theta = -theta); this gave the correct direction of rotation.
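A minimal sketch of the protected-pixel idea, assuming a nearest-cell mapping from the 1024x768 map down to the 800x400 model; the mapping itself and the names are illustrative.

```java
// Shrinks the 2-D map; a destination cell that already holds a marker is
// never overwritten, so 1s (pockets), 2s (balls) and 3s (cue) are not lost.
public class MapScaler {

    static int[][] scale(int[][] src, int dstW, int dstH) {
        int srcH = src.length, srcW = src[0].length;
        int[][] dst = new int[dstH][dstW];
        for (int y = 0; y < srcH; y++) {
            for (int x = 0; x < srcW; x++) {
                int ny = y * dstH / srcH; // several source pixels land on one cell
                int nx = x * dstW / srcW;
                if (dst[ny][nx] == 0) {   // keep any marker already written
                    dst[ny][nx] = src[y][x];
                }
            }
        }
        return dst;
    }

    public static void main(String[] args) {
        int[][] src = new int[4][4];
        src[3][3] = 2; // a ball in the corner
        int[][] dst = scale(src, 2, 2);
        System.out.println(dst[1][1]); // the ball marker survives the shrink
    }
}
```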
Chris
Thursday, 12 April 2012
Revised Image Analysis (part 2)
In part one it was decided to use the HSV colour space to filter out the unnecessary parts of the image. There are 3 main sections needed: the table co-ordinates, ball co-ordinates and cue angle. After extensive testing, the required filtering for each component was found.
To filter for the table, the following filtering will be used: 0.22 < Hue < 0.49
To filter for the ball: Hue < 0.21, or 0.2 <= Hue < 0.5 and Value >= 0.64, or Hue > 0.49
To filter for the cue: Hue < 0.1 or Hue > 0.9
It was decided from the testing that it would be easiest to detect red balls as the values for Hue stayed relatively constant over the testing.
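The limits above can be sketched as simple threshold tests, assuming hue and value are normalised to the range 0..1 as in the post; the method names are illustrative.

```java
// Per-pixel classification using the HSV filtering limits from the post.
public class HsvFilter {

    // Table (green felt): 0.22 < Hue < 0.49
    static boolean isTable(float hue) {
        return hue > 0.22f && hue < 0.49f;
    }

    // Ball: Hue < 0.21, or 0.2 <= Hue < 0.5 and Value >= 0.64, or Hue > 0.49
    static boolean isBall(float hue, float value) {
        return hue < 0.21f
            || (hue >= 0.2f && hue < 0.5f && value >= 0.64f)
            || hue > 0.49f;
    }

    // Cue (red tip): Hue < 0.1 or Hue > 0.9
    static boolean isCue(float hue) {
        return hue < 0.1f || hue > 0.9f;
    }

    public static void main(String[] args) {
        System.out.println(isTable(0.35f)); // green felt hue passes the table filter
        System.out.println(isCue(0.95f));   // red wraps around hue 1.0, so the cue tip passes
    }
}
```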
Tuesday, 10 April 2012
Getting and Using the Pocket Co-ordinates
The pockets were marked as 1s on the 2-D map of the table. Only the corner pockets were found, not the centre pockets, as these could be added later.
The table was split into quarters as it was assumed the image taken would be close enough to the table so that each pocket would be in a quarter of the image.
The image was then scanned and the 4 corner pockets were found and saved in two arrays: one which stored the X co-ordinates and one which stored the Y co-ordinates.
Theta (for Rotation) = arctan((Ytopright - Ytopleft) / (Xtopright - Xtopleft))
Xshift = Xtopleft, Yshift = Ytopleft
resize X = Xbottomright
resize Y = Ybottomright
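A sketch of the formulas above, assuming the two arrays hold the pockets in the order top-left, top-right, bottom-left, bottom-right (the ordering is an assumption):

```java
// Derives the rotation, shift and resize parameters from the corner pockets.
public class TableTransform {

    // xs/ys hold the pockets as [0]=top-left, [1]=top-right, [2]=bottom-left, [3]=bottom-right
    static double rotationTheta(int[] xs, int[] ys) {
        // Theta (for rotation) = arctan((Ytopright - Ytopleft) / (Xtopright - Xtopleft))
        return Math.atan2(ys[1] - ys[0], xs[1] - xs[0]);
    }

    public static void main(String[] args) {
        int[] xs = {10, 1010, 12, 1008};
        int[] ys = {20, 20, 520, 522};
        double theta = rotationTheta(xs, ys); // 0.0 here: the top edge is level
        int xShift = xs[0], yShift = ys[0];   // Xshift = Xtopleft, Yshift = Ytopleft
        int resizeX = xs[3], resizeY = ys[3]; // resize X/Y from the bottom-right pocket
        System.out.println(theta + " " + xShift + " " + yShift + " " + resizeX + " " + resizeY);
    }
}
```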
Chris
Saturday, 7 April 2012
Getting JSON string as an Integer
The get-data algorithm returns the JSON as a string; this string needs to be converted to an integer.
The string contains new lines ("\n") within it, so getting the integer was more difficult than it first appeared.
To get the integer from a string, the following Java code is used:
Integer.parseInt(string)
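Because of the new lines, a plain Integer.parseInt(...) on the raw string throws a NumberFormatException; a minimal sketch of the clean-then-parse step, using trim() to strip the surrounding whitespace (the method name is illustrative):

```java
// Turns the server's raw response string into the result integer.
public class JsonResult {

    // The raw string carries "\n" around the digit, so trim before parsing
    static int toResult(String raw) {
        return Integer.parseInt(raw.trim());
    }

    public static void main(String[] args) {
        System.out.println(toResult("\n1\n")); // 1 = the shot will be a hit
    }
}
```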
Chris
Tuesday, 3 April 2012
Text to Speech and New Layout
For reference for the blog, text to speech will be referred to as tts.
Reference to the tts sdk is found at http://developer.android.com/reference/android/speech/tts/TextToSpeech.html
Again, the New Boston gives tutorials on using the Text to Speech :
http://thenewboston.org/watch.php?cat=6&number=187
The tts needs to be initialised; this is done using the OnInitListener. The language is then set to UK.
The phrases for the text to speech are:
- Your Shot will be a Hit
- Your Shot will be a Miss
- No Server Found
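The tts initialisation itself is Android-framework code (OnInitListener, speaking through the TextToSpeech object), so a self-contained sketch can only cover the phrase selection; this illustrative example maps the result integer from the physics engine (1 = hit, 2 = miss) to the phrases above.

```java
// Picks the phrase the tts will speak from the physics-engine result.
public class ShotPhrases {

    // 1 = hit, 2 = miss; anything else is treated as the server not being found
    static String phraseFor(int result) {
        switch (result) {
            case 1:  return "Your Shot will be a Hit";
            case 2:  return "Your Shot will be a Miss";
            default: return "No Server Found";
        }
    }

    public static void main(String[] args) {
        System.out.println(phraseFor(1)); // what the tts would speak for a hit
    }
}
```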
It was discovered that the tts works in a standalone Activity, but when the tts was integrated into the Tab View it couldn't be initialised.
So a new layout for the application was to be created.
This would be a menu which contained 3 buttons; Start, Instructions and Preferences.
The background colour of this menu is set to "009900".
Sunday, 1 April 2012
Deadlines
Bench Inspection day is 24th April so the project needs to all be working before then.
The Thesis is due in on 3rd May.
Friday, 30 March 2012
Linking the three Sections
This is the most important part of the project - if the sections cannot be linked together then we only have 3 individual parts which do not pass information between them.
We've allocated all of the time between now and the bench inspection for linking, testing, and refining the code.
Thursday, 29 March 2012
Quitting the Application
It was found that the accelerometer does not turn off when the application is quit by the user, and so the phone continued to vibrate. This is because Android does not close an application but lets it continue to use some of the memory.
A way of stopping the accelerometer was devised so that the phone didn't continually vibrate.
Method 1. Use the onDestroy method; code for this is shown below:
protected void onDestroy() // Stops all activity when the Activity is destroyed
{
    super.onDestroy();
    // After this is called, your app process is no longer available in DDMS
    android.os.Process.killProcess(android.os.Process.myPid());
}
Method 2. - Override the Back button to close the application when pressed (Android doesn't allow the Home button to be overridden). Code for this is shown below:
public boolean onKeyDown(int keyCode, KeyEvent event)
{
    if (keyCode == KeyEvent.KEYCODE_BACK)
    {
        // Overrides the Back key to quit the Activity when it is pressed
        this.finish();
    }
    return super.onKeyDown(keyCode, event);
}
Chris
Tuesday, 27 March 2012
Extras to the GUI
The application worked at a basic level; in order to make it more exciting for the user, extra features could be added to it. The two ideas which were to be implemented were:
- Using the Accelerometer to make sure the user is holding the phone flat, vibrating if it isn't
- Reading whether the shot is to be a hit or miss to the user.
A lot of modern phones running the Android operating system contain accelerometers; they also contain a vibrate function, normally used when the user wants the phone to be quiet.
The android website contains an example of using the accelerometer:
http://developer.android.com/resources/samples/AccelerometerPlay/src/com/example/android/accelerometerplay/AccelerometerPlayActivity.html
There is also the source code for using the Vibrate function which is found here:
http://developer.android.com/reference/android/os/Vibrator.html
When the activity is created, a new sensor manager is set up which 'listens' for a change in the sensor values. The threshold was set at 1 or -1 in both the x and y directions for the vibrate function to be turned on. The implemented code is shown below:
Vibrator v = (Vibrator) getSystemService(Context.VIBRATOR_SERVICE);
v.cancel();
// check sensor type
if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
    // assign directions
    float x = event.values[0];
    float y = event.values[1]; // Measure the accelerometer in the X and Y directions
    if (x > 1 || x < -1 || y > 1 || y < -1) { // Occurs if they exceed the defined values
        v.vibrate(1000); // Makes the phone vibrate for 1 second
    }
}
Chris
Monday, 26 March 2012
Extracting the Data from the Web-server
The application requires that the phone gets the data back from the web-server.
This data is to be a result integer which is a '1' if the shot is to be a hit or a '2' if it is to be a miss.
A method of doing this is to use JSON (http://www.json.org/)
JSON is built on two structures:
- A collection of names and values
- An ordered list of values
Twitter uses JSON to store a user's tweets, which can then be accessed by applications.
The New Boston offers tutorials on the application side of JSON:
http://thenewboston.org/watch.php?cat=6&number=150
The tutorial returns the last tweet of a twitter user.
The server then needs to store the integer as JSON so that it can be accessed using similar code.
To make creating the JSON objects simple for amateur developers, the json-simple API (hosted on Google Code) can be used to create the JSON objects. This source is found here:
http://code.google.com/p/json-simple/
After this the integer from the physics engine was returned to the phone as a string.
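A hedged sketch of pulling the result integer back out of the returned string; the project used the json-simple API for this step, so the field name and the crude digit-stripping below are stand-in assumptions only.

```java
// Recovers the result integer from the single-field JSON the server returns.
public class ResultJson {

    // Crude stand-in for the json-simple parsing: keep only the digits.
    // The field name "result" and the JSON shape are assumptions.
    static int extractResult(String json) {
        return Integer.parseInt(json.replaceAll("[^0-9]", ""));
    }

    public static void main(String[] args) {
        System.out.println(extractResult("{\"result\": 1}")); // 1 = hit
    }
}
```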
Sunday, 18 March 2012
User Preferences
A good application allows the user to modify some of the software properties in order to match their needs and tastes. This is where preferences are used.
For our application, our preferences would be :
- User Name: a string with the user's name (needed if we were to use a database)
- Manually re-take photo: the user can choose whether the application re-takes a photo automatically or manually. This is a checkbox and returns a boolean.
- Camera timer: the user can set the length of time the camera waits before taking a photo, which can range from 2-5 seconds.
The New Boston offers tutorials on how to integrate user preferences with an Application:
http://thenewboston.org/watch.php?cat=6&number=54
Chris
Tidying up the Code
Before the sections are combined, the code needs to be tidied up. This involves removing all statements that are printed to the command box as the program is running.
As a result, the program will run quicker and more efficiently.
Grace
Wednesday, 14 March 2012
Cue Direction
The cue direction is taken from the Data Acquisition section and passed via the Web Server to the Physics Engine.
It can be used to set the initial ball direction. One of the initial assumptions is that the cue is always set up to hit the ball straight on, so it can be assumed that the cue ball will always move in the direction of the cue.
Tuesday, 13 March 2012
Updating the Physics Engine
A more Updated version of the Physics engine was ready to be placed on the Web Server.
The physics engine was being tested by using a mouse click to place a ball and create a movement vector on the ball, which simulated the shot.
For the web server this mouse click was undesirable, and the mouse handlers needed to be removed, which is difficult since the mouse clicks determine the ball positions.
Instead of the mouse click determining the position of the balls, a 2-D array which acted as a map was used; this contained a value of '2' wherever there was a ball. The locations of the 2s gave the co-ordinates of the balls, and so a ball was placed at each co-ordinate.
It was tested using a perfect circle drawn on a rectangle in Microsoft Paint. This didn't include the scaling algorithm, so the pixel co-ordinates matched the table co-ordinates.
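The map scan described above can be sketched like this; the names and the list-of-co-ordinates shape are illustrative, not the project's actual code.

```java
import java.util.ArrayList;
import java.util.List;

// Replaces the mouse clicks: scan the 2-D map for cells holding a 2 and
// collect those co-ordinates as the ball positions.
public class BallMapScan {

    // Returns the {x, y} co-ordinate of every cell holding a 2 (a ball)
    static List<int[]> ballPositions(int[][] map) {
        List<int[]> positions = new ArrayList<>();
        for (int y = 0; y < map.length; y++) {
            for (int x = 0; x < map[y].length; x++) {
                if (map[y][x] == 2) {
                    positions.add(new int[]{x, y});
                }
            }
        }
        return positions;
    }

    public static void main(String[] args) {
        int[][] map = new int[3][3];
        map[1][2] = 2; // one ball at x = 2, y = 1
        System.out.println(ballPositions(map).size());
    }
}
```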
Chris
Tuesday, 6 March 2012
Camera Resolutions
It should be noted that throughout the development of the project, testing is an ongoing activity.
The image upload process has been tested on several phones (HTC Explorer, HTC Desire and HTC Desire S). When the program was tested on the HTC Explorer and HTC Desire the image had a resolution of 768x1024, but when the HTC Desire S took an image the resolution was 1952x2592. This image was too large a file for the server to accept.
It was decided that all photos taken should be of the resolution 768x1024.
In order to set the resolution of the Camera, several steps must be taken by the software:
- The camera hardware needs to be asked which resolutions are supported; this is returned as a list of heights and widths
- The list is then scanned for the right resolution
- Once the resolution is found, its list number is returned
- The camera is set to the correct resolution using the list number
The list is also shown to the user in the instructions so that they know which resolutions are supported and whether the correct resolution is available.
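The scanning step can be sketched with plain arrays standing in for the Android Camera.Size list (which the hardware returns from getSupportedPictureSizes); the method name and the -1 'not found' convention are assumptions.

```java
// Finds the list number of the wanted resolution among the supported sizes.
public class ResolutionPicker {

    // sizes holds (width, height) pairs as reported by the camera hardware;
    // returns the list number of the wanted resolution, or -1 if unsupported
    static int findResolution(int[][] sizes, int wantWidth, int wantHeight) {
        for (int i = 0; i < sizes.length; i++) {
            if (sizes[i][0] == wantWidth && sizes[i][1] == wantHeight) {
                return i;
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        int[][] supported = {{1952, 2592}, {768, 1024}}; // example hardware list
        System.out.println(findResolution(supported, 768, 1024)); // list number 1
    }
}
```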
Sunday, 4 March 2012
Creation of the Tab Layout
The Tab Layout view was created showing the 3 tabs, which run in tandem with each other. The user can quickly switch between all the activities. Below is the picture showing the instructions to the user.
Above is the Camera View, which uses a surface view to take a picture.
This is the Web View, which shows the web page output to the user (Note: the server was off at the time of writing).
Wednesday, 29 February 2012
Corner Detection
To account for filtering errors, the angle between each pair of lines will be checked 5 degrees either side of 90, giving a range of 85 to 95 degrees of separation between the lines.
Different Directions
One of the aims for the project was for the program to test a variety of different directions. In order to do this, I ran a loop for the program to test 10 different angles, adding pi/5 each time.
There is a delay of 1 second while the program sends the ball in each direction.
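A minimal sketch of that loop; the velocity call and the one-second delay are shown as comments since they belong to the simulation, and the names are illustrative.

```java
// Generates the 10 test directions, each pi/5 further round than the last.
public class DirectionTest {

    static double[] testAngles() {
        double[] angles = new double[10];
        double angle = 0;
        for (int i = 0; i < 10; i++) {
            angles[i] = angle;
            // balls[0].velocity would be set from this angle, then the program
            // waits 1 second (Thread.sleep(1000)) while the ball travels
            angle += Math.PI / 5;
        }
        return angles;
    }

    public static void main(String[] args) {
        System.out.println(testAngles().length); // 10 directions tested
    }
}
```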
Creation of the User Interface
The application requires a user interface to link the user between the functions.
The first screen displayed would be a 'splash screen' which would be loaded for a few seconds with the application name.
The Android developer page has several layouts pre-defined that can be used by any developer.
http://developer.android.com/resources/tutorials/views/index.html
The best-looking layout with the functionality required was the Tab Layout view; it would be made with 3 tabs:
1st Tab - Displayed after the splash screen and contains the Instructions
2nd Tab - Contains the image capture and file-sending algorithm
3rd Tab - A basic Web View to show the web output; this would be very basic so that the user can only view one web page. All other options would be disabled for the user
Chris
Monday, 27 February 2012
Passing between Image Analysis and Physics Engine
2-D arrays need to be passed between the Image Processing and Physics engine. Functions were placed in both parts to allow the passing of the 2-D arrays.
Chris
Wednesday, 22 February 2012
Hough Transform Fix
A fix was found for the Hough transform from Essex University. The accumulator was correct, but the drawing of the Hough lines was offset, so the new algorithm is array[x][y] instead of array[x][centre-y-1]:
http://vase.essex.ac.uk/software/HoughTransform/HoughLine.java.html
GUI Update
The File Upload program works with the Image stored in the server folders.
Modifications were then made to the JSP pages so that the image sent from the phone was analysed by the image-processing algorithm.
During testing of the application, a problem was discovered in the timing of the camera.
If the timer is interrupted before it has finished, the application crashes. Work needs to be done on the exception catching within the application.
Chris
Monday, 20 February 2012
Physics Engine - ball direction
As the cue is obtained using polar coordinates, I decided that the simulation of the table should use polar coordinates (angle and magnitude) rather than x and y coordinates. This should make it simpler to link the sections together.
Wednesday, 15 February 2012
Ball Directions
The ball directions are controlled by vectors:
balls[n].velocity.set(xvector, yvector)
It is possible to set these vectors to control the speed and direction of the balls. Only the velocity of the cue ball needs to be set; the other balls will have a velocity of zero until the cue ball collides with them.
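Since the cue direction arrives as an angle, a small conversion gives the x and y components that balls[n].velocity.set(...) expects; the magnitude argument here is an assumed input, and the names are illustrative.

```java
// Converts the polar (angle, magnitude) shot into the {x, y} velocity vector.
public class PolarVelocity {

    static double[] toVector(double angle, double magnitude) {
        return new double[]{
            magnitude * Math.cos(angle), // x component
            magnitude * Math.sin(angle)  // y component
        };
    }

    public static void main(String[] args) {
        double[] v = toVector(0.0, 5.0); // a shot straight along the x axis
        System.out.println(v[0] + ", " + v[1]);
    }
}
```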
Revised Image Analysis
When testing different images under different lighting conditions, the original filtering technique (RGB) left too much noise in the image, so the table and ball could not be detected. It was decided to use two different filters: one for the table and one for the ball. The technique used will be the hue, saturation and value (HSV) colour map.
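For the colour-space change itself, the standard java.awt.Color.RGBtoHSB call returns hue, saturation and brightness each normalised to 0..1, which matches the form of the filtering limits used in part 2 ('value' and 'brightness' are the same component); the wrapper name here is illustrative.

```java
import java.awt.Color;

// Converts one RGB pixel to the HSV form the filtering limits are written in.
public class HsvConvert {

    // r, g, b are 0..255 channels; returns {hue, saturation, value}, each 0..1
    static float[] toHsv(int r, int g, int b) {
        return Color.RGBtoHSB(r, g, b, null);
    }

    public static void main(String[] args) {
        float[] green = toHsv(0, 255, 0); // pure green, like the table felt
        System.out.println(green[0]);     // hue is about 0.33
    }
}
```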
Saturday, 11 February 2012
Removing the Mouse Handlers
The mouse clicks and drags have controlled where and when the balls are placed on the table so far. By removing the mouse handlers, it is possible to make the program set up the balls and table automatically.
The function to set up the table with the balls on it needs to be called from the main function in the BallPanel class.
Friday, 10 February 2012
PHP Research
The problem was narrowed down to the server-side script.
PHP.net -> the site for PHP developers
On the site there is a page which explains how to handle POST-method uploads; this resource was used extensively to learn how the PHP file works.
Chris
Monday, 6 February 2012
Continued Work on PHP file
In order to discover the problem with the file upload program, the Android Developer website was looked at to find out how the Android code works.
The code makes use of the HTTP POST method, which is part of the API.
"The post method is used to request that the origin server accept the entity enclosed in the request" - Android Developer Page
The code takes the following steps:
- First a connection is made to the server
- POST method request made
- Data is then sent to the server until no data is left
- The connection is then closed
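A hedged sketch of those steps using HttpURLConnection; the server URL is a placeholder, and the helper method is illustrative, split out only so the payload-size step can be checked without a live server.

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

// Follows the upload steps: connect, make the POST request, send the data
// until none is left, then close the connection.
public class ImageUpload {

    // Number of bytes the POST body will carry (split out for checking)
    static int payloadLength(byte[] imageData) {
        return imageData.length;
    }

    static int upload(String serverUrl, byte[] imageData) throws Exception {
        // First a connection is made to the server
        HttpURLConnection conn = (HttpURLConnection) new URL(serverUrl).openConnection();
        conn.setRequestMethod("POST"); // POST method request made
        conn.setDoOutput(true);
        conn.setFixedLengthStreamingMode(payloadLength(imageData));
        try (OutputStream out = conn.getOutputStream()) {
            out.write(imageData); // data sent to the server until no data is left
        }
        int status = conn.getResponseCode();
        conn.disconnect(); // connection then closed
        return status;
    }

    public static void main(String[] args) {
        // A live call would look like: upload("http://<server>/upload.php", bytes)
        // -- no real server is assumed here.
        System.out.println(payloadLength(new byte[1024]));
    }
}
```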
Friday, 3 February 2012
Setting Ball Positions
So far, the balls appear when the mouse is clicked and, previously, only appeared where the mouse was clicked. However, I've now changed it so that the ball positions are set by the user:
balls[n].position.set(xcoordinate, ycoordinate);
The next stage will be to remove the mouse handlers so that the balls appear when the program is run rather than just when the mouse is clicked.
Monday, 30 January 2012
Rotation Matrix
One of the issues with the camera is that the table will not be parallel to the edge of the image unless the person takes a perfect image.
This means the image needs to be rotated to make it straight.
One way of doing this is by using a rotation matrix :
http://en.wikipedia.org/wiki/Rotation_matrix
The co-ordinates also needed to be scaled so that they match the co-ordinates and size in the physics engine.
A class was to be written which would rotate and scale the image so that the co-ordinates could match the physics model.
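The rotation itself is the standard 2-D rotation matrix from the link above applied to each co-ordinate; this sketch shows just that step (the shift and scale described in the pocket post would follow it), with illustrative names.

```java
// Applies the 2-D rotation matrix [cos -sin; sin cos] to one co-ordinate.
public class Rotate {

    static double[] rotate(double x, double y, double theta) {
        return new double[]{
            x * Math.cos(theta) - y * Math.sin(theta), // x' = x cos(theta) - y sin(theta)
            x * Math.sin(theta) + y * Math.cos(theta)  // y' = x sin(theta) + y cos(theta)
        };
    }

    public static void main(String[] args) {
        double[] p = rotate(1, 0, Math.PI / 2); // a quarter turn: (1,0) -> (0,1)
        System.out.println(p[0] + ", " + p[1]);
    }
}
```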