These are projects posted by the students of Dr. Gove Allen at Brigham Young University. These students have taken one semester-long course on VBA and generally have had no prior programming experience.

Wednesday, April 25, 2012

Android SMS App

Short Message Service (SMS) is an important means of communication.  According to Wikipedia, more than 7 trillion SMS messages were sent in 2011.  To prevent SMS spamming, service providers have placed a maximum of 10 recipients on any given message.  As the technology has evolved, the ability to send SMS messages through a web service provider has made sending messages programmatically more efficient.  However, using a web service provider makes receiving responses awkward at best.  To compensate, this project creates a client-side Android application that automates sending an SMS message to more than 10 recipients through a smartphone’s SMS application.  The major advantages of this application are automated sending to more than 10 recipients and, for the recipient, the ability to respond easily and directly to the sender’s phone.

The Android application is built against Android 2.3.3 (API level 10), because this API level serves nearly 80% of devices currently on the market.  The application uses an open-source library called JExcel to connect to an Excel document and obtain data from the first two columns of the first sheet of a workbook stored in the device’s “/data/” folder.  It then loops through the rows, sending an SMS message to each recipient listed in the Excel document.  The system begins with a simple splash screen that asks the user for a file name and a message.  The message is limited to 160 characters; anything over that is truncated.  After supplying the file name and message, the user has two options: send the message or exit the program.  Once the send button is pressed, the program parses the Excel data, stores it in an array, and then calls the SMS application to send each message individually.
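The application itself is Java on Android, but the core loop is simple enough to sketch. As an illustration only, here is the logic in Python: read (phone, name) pairs, truncate the message to 160 characters, and send to each recipient. The `send_sms` callable is a hypothetical stand-in for Android's `SmsManager` call; none of these names come from the project itself.

```python
SMS_LIMIT = 160  # standard single-message SMS length


def truncate_message(message):
    """Anything over 160 characters is cut off, as in the app."""
    return message[:SMS_LIMIT]


def send_to_all(recipients, message, send_sms):
    """Loop over (phone, name) pairs and send each one the truncated text.

    `send_sms` is a hypothetical stand-in for the platform's SMS call
    (SmsManager.sendTextMessage on Android)."""
    body = truncate_message(message)
    sent = 0
    for phone, name in recipients:
        send_sms(phone, body)
        sent += 1
    return sent
```

Because the loop sends one message per recipient, the carrier-imposed 10-recipient cap on a single message never applies.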

Wednesday, April 18, 2012

Helping 301 Students Organize Financial Statements

This past year, I have had the privilege of co-teaching the Business Management 301 class in the Marriott School of Management. The students in the class often have trouble organizing data into the standard financial spreadsheets. In addition, the students often have a difficult time remembering and calculating the 15 basic financial ratios that are taught in the class.

My final project seeks to alleviate some of that pain by allowing students to check their work by inputting the basic financial information of a company and seeing that information displayed very simply in all 3 financial statements. This would be a supplement to the work that they have done already, and would be a “check” on all things that they’ve done. In addition, the “Ratios” tab that is created in the spreadsheet calculates the ratios of the company so that students can check their work.

In addition to the ratios calculated from data on the “Data Entry” tab, a few additional ratios are retrieved from Yahoo Finance via a Web Query. If the stock is publicly traded, the student can input the stock ticker into a cell and see the P/E ratio, EV/EBITDA ratio, and Beta of the stock for the most recent period. This information is very valuable for students who are beginning in finance, and would be something that I would send to students so that they could learn more quickly.

Here are the files:

Monday, April 16, 2012

Joe Hilton - solar field simulator and optimizer

The Problem:  solar power generation can be a very capital-intensive process.  Most of this capital is deployed in building the solar field, which contains hundreds or even thousands of individual heliostats (a heliostat is a mirror or other glass pane attached to a motor and gear that enable it to track the movement of the sun).  In a power-tower solar configuration, all these individual heliostats track the sun and position themselves to reflect its rays to a central tower, where the heat is gathered and used to generate usable energy (watch the movie Sahara to see an example of what this looks like).

Obviously, the plant owner wants to know how much energy can be expected from a given solar field.  Furthermore, the plant owner also wants to know what the optimum configuration of a solar field should be in order to maximize energy collected for capital deployed on heliostats.  The trick is that the photons in each sun ray do not always behave as expected.  They can bounce at funny angles.  They can be blocked by clouds.  They can be attenuated by dust.  They can do all sorts of unpredictable things on their way from the sun to the heliostats to the power tower.  Given that building a solar field is a very expensive proposition that must be born full grown (as in, it won’t work unless the entire field is built all at once), the problem is to accurately predict the performance of a theoretical solar field, and then to optimize that field to get maximum performance for a given number of heliostats – and dollars.
The Solution: basically, my project builds various theoretical solar field configurations based on the building sites and heliostat materials available, and then runs Monte Carlo simulations of those sites across sun positions throughout the year, based on the user’s input for the latitude of the building site.  My writeup walks through the steps I took to build this solution, with detailed explanations of how I did what I did, why I did it, where the limitations are, and what could be done to extend the scope of the project.
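The project's simulation is in VBA and models the full sun geometry; purely as an illustration of the Monte Carlo idea, here is a minimal Python sketch. Each simulated ray is aimed at the receiver but perturbed by a random angular error (clouds, dust, mirror imperfections), modeled here as Gaussian scatter of the landing point. Every parameter name and value is an assumption for the sketch, not a figure from the write-up.

```python
import math
import random


def simulate_field_efficiency(n_rays, aperture_radius, scatter_sigma, seed=0):
    """Monte Carlo estimate of the fraction of reflected rays that land
    on the receiver aperture.

    Models each ray's landing point as the aperture center plus a 2-D
    Gaussian offset (a crude stand-in for clouds, dust, and mirror
    error). All parameters are illustrative assumptions."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_rays):
        # Landing offset from the aperture center, in meters.
        dx = rng.gauss(0.0, scatter_sigma)
        dy = rng.gauss(0.0, scatter_sigma)
        if math.hypot(dx, dy) <= aperture_radius:
            hits += 1
    return hits / n_rays
```

Running the estimate across many heliostat placements and sun positions, then keeping the configuration with the best energy-per-heliostat ratio, is the optimization loop the write-up describes.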

Here are the files:


Jared Miller
VBA Final Project Executive Summary

This project searches all 268 different Craigslist U.S. sites to see the different design jobs that are available. It pulls all the jobs that match the search criteria I enter. Then the program removes all old data, keeping only the listings posted today. Next it pulls the corresponding links into an Excel spreadsheet, opens each page, and searches it to find the information I need to decide whether I should look further into the job (e.g. compensation). While this program was written specifically to look for design jobs, it can search across all Craigslist sites for a number of different inputs, including houses, cars, etc.

Bank Impairment Worksheet

The purpose of this project in VBA is to facilitate the use and uniformity of an impairment analysis of bank assets. In accordance with Financial Accounting Standards (FAS) 114 and FAS 5, every loan held by First Utah Bank must be analyzed for impairment. In layman’s terms, this means each loan must be analyzed to make sure that the bank will get back all of the principal and interest owed. This is a very strict rule enforced by the Federal Reserve, which audits the bank’s loan portfolio semiannually.

In the past, the documentation for this process was fragmented and unfamiliar to the loan officers who were required to fill out these forms. There were many questions as to what needed to be included in an impairment analysis. Each loan officer took a different approach to justifying whether or not a loan should be considered impaired. In addition to these inconsistent approaches, other problems existed with the process, mainly human errors when inputting information into the document, and failure to update and print out all the new documents semiannually for each audit from the Fed.

When auditors from the Federal Reserve would see multiple approaches to doing an impairment analysis (many of them very incorrect) or errors in an impairment analysis, it reflected very poorly on the bank’s management and resulted in tighter controls and regulations, which, from the bank’s perspective, is universally deplored.

In order to facilitate the uniformity of the impairment analysis document, this VBA project includes a much friendlier form, with only minimal inputs required from the user. Any time that information required on the form can be obtained from the bank’s customer database, that information is pulled automatically from the database rather than introducing the potential for human error into the system.

Finally, when the user is done completing the form, there is a button that the user can press which will print all the forms created. This is a massive time saver for the user, who in the past would have to go in and individually print out dozens, if not hundreds of documents.

Below you will find a complete walkthrough of the form, including how it is to be used and what is needed to complete the form. There are five sections to the form, with each section being detailed below.

Saturday, April 14, 2012

Pianist's repertoire database

My VBA project establishes a database for piano teachers, their students and the students’ repertoire in 5 different musical periods: Baroque, Classical, Romantic, Impressionist and 20th Century. This database includes all details of a pianist’s entire repertoire, including era, composer, the piece title (and any accompanying nickname), and the catalogue number if applicable (i.e. “Opus” or “BWV #”, etc.).

When the form “PianoRepertoireForm” is opened, the user first selects a teacher listed in the teacher list, and a list of students pops up in the student list. The user then selects a student, and all 5 musical eras populate of their entire repertoire. A musician is considered to have a piece in their repertoire when they have thoroughly learned a piece and given a performance of the work (preferably memorized for pianists).

This enables the user to quickly gain a visualization of their repertoire, seeing the big picture regarding their life’s musical work and giving them perspective about new works to learn, their next recital program, goal-setting, big projects to undertake and a desire to resurrect works previously learned for future performances.

Users that would greatly benefit from such a database and visual list would be:

- Piano teachers and professors who want to see what their students’ repertoire is, quickly visualizing the strengths and any “holes” in that student’s repertoire.

- Piano students interested in seeing their own repertoire for strengths and holes, and also to compare their repertoire with their peers’ (to the extent such repertoire lists are made public).

- Professional orchestras, recital and concert series, artist agencies, universities and music conservatories and even private teachers looking to recruit specific students with specializations or interesting recital programs.

- Pianists needing to submit repertoire lists for auditions, applications for university/conservatory, competitions, etc. This gives them a ready-made list that is easily edited by simply selecting the “add new work” drop-down feature from each era.

Friday, April 13, 2012

Your Personal San Francisco Craigslist Search

This project was intended to make housing searches on Craigslist more efficient. It was created for personal use only and has no business advantage. The motivation behind the project was an imminent move to San Francisco. I will be moving one week after graduating from the MBA program, and have a very busy course load of 19.5 credits. The housing market in San Francisco is competitive and requires a large time commitment to online searching, almost all of which is done on Craigslist. My intention was to automate the craigslist search process in San Francisco and thereby save myself a lot of time doing repeat searches. My VBA program is intended to allow the user to enter search criteria from an independent user form in Excel. The program will then collect the data from the user, plug the data into the appropriate fields in the craigslist online search, and then deliver the search results into an Excel spreadsheet. The user (myself) would then be able to view the results of the query loaded into one place, and organized into categories of 1) listing date 2) price 3) bedrooms 4) the link to the posting. If the listing is of interest to the user, they can then click on the link to view any images connected to the posting, and from there contact the lister.

House For Rent - KSL Classifieds Web Scraper

I have recently begun a new job as the Director of Acquisitions for a private equity real estate firm that is investing in single family homes in Salt Lake County. In assessing the investment potential of each home, it is very important to be familiar with the current rent potential for each neighborhood, submarket, etc. In order to gain the necessary market knowledge to make prudent investment decisions, I am very interested in researching the available homes for rent on KSL in a manner that allows for quick access and aggregation.

With this in mind, I have sought to build a web scraping tool that goes into KSL’s classifieds and searches for homes for rent along the Wasatch Front. The tool will then import that information into a spreadsheet where it can be aggregated, sorted, and manipulated for purposes that relate more specifically to my employment.

Tax Lien Aggregator

Executive Summary

Final Project Description

Investors that seek substantial returns by acquiring properties that are distressed usually pay a subscription fee to local firms that provide listings of these properties. These firms typically include title companies, data aggregators, or others that specialize in gathering such information for local real estate professionals.

There are a number of publicly available websites that provide various types of information regarding distressed properties, but most sites with reliable information require a subscription fee. However, most distressed properties have property taxes that have not been paid, and county websites list those properties that have tax liens against them for unpaid back taxes. In Salt Lake County, this information is published on the county’s website.

The problem with this website is that it provides limited information that, in and of itself, is not very useful, because it doesn’t include the address of the property, only the parcel number. This makes it virtually impossible to look for distressed assets in specific geographic areas. Potential homeowners or investors looking for distressed properties to purchase in Salt Lake County have to access two separate websites in order to determine the addresses of properties that are subject to a tax sale due to delinquent property taxes. A second website, the county assessor’s, can be used to find a property’s address by entering the parcel number into a search query.

This final project is a web query that aggregates the tax lien information from the first website with the address information from the second. In addition, the second website has an interface with Google Maps that provides coordinate information. These coordinates include longitude and latitude that can be used to plot the location of the property in Google Earth Pro.
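The project does this aggregation in VBA against live websites; the join step itself can be sketched in a few lines of Python. Here, lien records keyed by parcel number are merged with a parcel-to-address lookup, as would be scraped from the assessor's site. The field names (`parcel`, `balance_due`, `address`) are illustrative assumptions.

```python
def aggregate_liens(lien_records, address_lookup):
    """Join tax-lien records (keyed by parcel number) with addresses.

    `lien_records` is a list of dicts like
    {"parcel": "16-05-100-001", "balance_due": 2500.0}, and
    `address_lookup` maps parcel number -> address string, as would be
    scraped from the assessor's site. Field names are illustrative."""
    merged = []
    for record in lien_records:
        combined = dict(record)
        # Parcels with no address are kept but flagged with None,
        # matching the write-up's note about cleaning these up later.
        combined["address"] = address_lookup.get(record["parcel"])
        merged.append(combined)
    return merged
```

The parcel number is the only key the two sites share, which is why the whole project hinges on querying both.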

The query will be limited to those properties that are subject to a tax sale. This will allow a homeowner or investor to visually see the location of all properties in Salt Lake County that will be sold at auction by the taxing authority if the back taxes are not paid in full within five years.

Code Elements

The code that was written for the final project contains three parts: the webQuery sub-procedure to query the county website, a startingRow function that determines where data should start being entered on the query spreadsheet, and the obtainAddress sub-procedure to query the assessor website.

When the county website is queried, it lists only 200 properties per page, and the user has to navigate to the next page to see the next 200. The webQuery sub gathers parcel number, owner name, category code, category description, status, balance due, first delinquent tax year, and tax sale year for each property on each page. It starts on the first page, extracts all the information from that page, and stores it on Sheet1. It then loops through each subsequent page, downloading information on up to 200 properties per pass.

The startingRow function evaluates the row where the last entry from the previously queried page was stored and indicates where the first entry of the page currently being queried should be entered. This allows the webQuery sub-procedure to loop its query over each page of the county website without overwriting information already saved on Sheet1.
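The startingRow idea is essentially "find the first empty row." The original is a VBA function inspecting a worksheet; as an illustrative Python sketch, the column is just a list of cell values, with `None` or an empty string meaning an empty cell:

```python
def starting_row(sheet_column):
    """Return the index of the first empty row in a column, mirroring
    the write-up's startingRow function. An empty cell is represented
    by None or an empty string; if no cell is empty, the next row
    after the data is returned."""
    for i, value in enumerate(sheet_column):
        if value is None or value == "":
            return i
    return len(sheet_column)
```

Each pass of the page loop writes its 200 rows beginning at this index, so nothing from earlier pages is overwritten.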

Whereas the county website provides information on 200 properties at a time, the assessor’s website provides information on only one property at a time. As such, it takes significantly longer to gather address information: the county website can be queried in only a few minutes, but the assessor’s website may take hours. For this reason, the query for the assessor’s website is broken out into a separate sub-procedure, allowing the user to run each query at a different time.

Due to differences in the way the two websites are designed, the obtainAddress sub-procedure needed to make significant adjustments to html text in order to make it useable. That is one reason why the code for this sub-procedure is so long.

Learning Outcomes

I didn’t realize when I started this project that I would have to learn a substantial amount of additional VBA functionality and HTML properties that were not covered in class. To begin with, the county website presents the results of its query in an HTML table, so I had to learn how to extract information from an HTML table. I found a website that discusses how to do this and provides some code, which I used in the final project. I still had to adapt this code so that it would work on the county website. Figuring this out took the majority of my time on the project.

For the obtainAddress query, I was able to use the code that was written by Professor Allen in the agent.xlsm file for querying websites. I still had to manipulate the code in order to get it to work for the assessor website.

In total, I would estimate that I spent fifty hours on this project. It was very challenging, but also very rewarding.

Results and Next Steps

The queries work beautifully, and I was able to download information for 525 properties. Obtaining the addresses took almost two hours, but because it worked so well, I was able to let it run by itself. If I had $300 to purchase Google Earth Pro, the next step would be to import the Excel file into Google Earth Pro to see the location of each property within Salt Lake County.

Additional functionality, which I may add in the future, could include removing all parcels with no address from the file and cleaning up the address lines for street types that do not appear in the correct column. Additional query information could also be gathered from the assessor’s website including obtaining land record, residence record, tax valuation, adjoining values, neighborhood values, and parcel characteristic information.

Here are the links for my files:

VBA Tax Preparation Program

I wanted to write a program that would help others figure their tax liability at a low cost. The biggest problem that I was trying to solve was that Americans don't know how to do their taxes. This program would provide an easy and cheap way to file your tax return.


Recipe Organizer

Preparing and consuming food may be a mundane task for some, but for me it is a pleasure that deserves to be well established and organized. So I attempted to organize some of it by compiling and categorizing recipes in a more convenient visual way.

Financial State of the Union

After further analyzing the situation, it became clear that the most important solution would be a time-saving solution.  Discussions with the boss revealed that there was a tremendous amount of time used gathering simple financial information in the effort to understand the “bottom line,” and that is where the idea for this project began.

The household where this project will ultimately reside is no different than many today – there are multiple sources of cash outflow (viz. credit cards) and a similar number of cash reserves (e.g. bank & checking accounts).  With recent legislation passing that affects the consumer credit industry, it is ever more important to be on top of monthly payments.  Calendar dates when bills are due can change every month, so the idea behind this project is to create an easy-to-use tool that automatically gathers the account balances, outstanding charges, and, most importantly, due dates of the major credit cards being used.  Phase two of the project will mirror this functionality for banking accounts.  Ultimately, this tool decreases the number of times during the month that a credit card user must visit the online account center.

Thursday, April 12, 2012

I am currently engaged and didn’t quite realize the whirlwind that was going to overcome me during this time. It isn’t so much the engagement that is overwhelming but everything going on in addition to the engagement: semester end projects, finals, job fly outs, etc. I decided to use my VBA final project to do something that will truly be useful for me and others involved in the wedding planning.

Sending out wedding invites is a big deal. It requires coordinating the efforts of a lot of different groups. After several weeks of several different directory versions flying around between the two families, I decided to create a user form that will be so simple that no one could possibly screw it up. I created a user form so that regardless of how tech savvy an individual may or may not be, they would be able to easily check if an individual is in the directory, add an individual, and know the count of guests according to different categories. The mission is clear. Now for the difficult part: implementation.

KSL Cars Classifieds Search and Regression: By Aaron Rickett

Executive Summary
My spreadsheet program is designed to help car buyers search the many advertisements on the KSL website to find the best deal. The program first allows the user to input their search criteria. The program then interacts with Internet Explorer to pull the listings from the KSL website and sort them into their own sheets by model. After the listings are pulled, the program performs an ordinary least squares statistical regression using year, mileage, and seller type as predictors of price.

I designed this project to help with the task of buying a car for my own personal use. Since I am a college student, I need to know that I am getting a good deal. However, in the future I would like to use this project to buy high-value cars which I could then resell for a profit after fixing minor mechanical problems.

Of key importance to this potential business is finding undervalued cars to buy. Many of the advertisements are far too overpriced to be worth buying. This is further complicated because there are many factors to consider for each car, and it is difficult to determine exactly how much each one should be weighted when buying a car. For example, is it a better deal to buy a 2010 Corolla with 25,000 miles for $13,000, or a 2004 Corolla with 150,000 miles? Since there is more to compare than simply price, this sort of analysis requires regression to best understand each factor’s influence on the price of the car. This tool allows the user to see which cars are undervalued and by how much, thus providing a higher likelihood of profit at resale.

Finding a good deal without this tool could require many hours poring through the many listings. This program automates the process and pulls the key factors of each car so that the user can get a good overview. The program also pulls the URLs of each listing so that the user can easily view the listings of highest value. These features make finding the right car easier and less time consuming.

The following is a list of the major components of the program with a short description of each one.

Ribbon Customization.
In order to make this program easy to access and run, Excel’s ribbon has been modified. There is a new tab called Regression, which has a button that starts the program by calling the user form.

User Form.
The user form allows the user to input data similar to the data required by the KSL website. However, unlike the KSL website, which consists only of check boxes, the user form consists of list boxes, text boxes, and a couple of check boxes. Input data is checked for validity and modified if needed before being passed to the web query.

Distance Query
The distance query gives the user functionality that the KSL form does not support. It is a tool, similar to the web control, that was used to find the distance between every pair of counties in all six states, in order to estimate the distance between a given zip code and all other zip codes in that area. For speed of processing, this information is held in the hidden sheet “DistMatrix”. The user form uses it to create a list of zip codes that are within a certain distance of a central zip code. This list is then passed to the web control.

Web Control.
The web control accepts the data from the user form, then accesses the internet, fills out the form on KSL, and pulls up the matching listings. This is done using a special agent class. Once the listings are pulled, the web control finds the name of the link and the time it was posted for each individual advertisement and passes them to the web query.

Data Extraction.
The data from the listings is extracted using Excel’s web query wizard in an automated VBA fashion. It receives the name of each link, uses it to pull down the whole page, and copies it to a hidden sheet called “dropsheet”.

Data Collection.
This component searches the dropsheet after each listing is dropped to find the key pieces of information, including ad number, date, price, year, seller type, mileage, sold or not sold, city, state, zip, cell number, phone number, contact name, URL, and description. This information is sorted into a sheet named after the model of car whose information it holds.

Data Checking.
After all of the data is pulled, it is checked in three parts. The first two parts check whether the data is suitable for a regression. The third check determines whether the car is truly an investment opportunity.

Matrix Loading.
The data is read into the variant data type to be prepared for regression.

Regression.
The regression is executed using ordinary least squares and produces model coefficients, one for each of the factors: year, seller type, and mileage. This regression technique relies on matrix operations.

Matrix Operations
This section includes a bundle of tools used to compute the matrix inverse, transpose, and multiplication needed for the regression.
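The original matrix utilities are VBA; purely as an illustrative sketch of the same design (transpose, multiply, and invert, then combine them via the OLS normal equations beta = (X'X)^-1 X'y), the pieces might look like this in Python. Every function name is my own, not the project's.

```python
def transpose(m):
    """Matrix transpose."""
    return [list(row) for row in zip(*m)]


def matmul(a, b):
    """Matrix multiplication."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]


def inverse(m):
    """Matrix inverse by Gauss-Jordan elimination with partial pivoting
    (no singularity checks; fine for small regression matrices)."""
    n = len(m)
    aug = [list(row) + [1.0 if i == j else 0.0 for j in range(n)]
           for i, row in enumerate(m)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(aug[r][col]))
        aug[col], aug[pivot] = aug[pivot], aug[col]
        div = aug[col][col]
        aug[col] = [x / div for x in aug[col]]
        for r in range(n):
            if r != col:
                factor = aug[r][col]
                aug[r] = [x - factor * y for x, y in zip(aug[r], aug[col])]
    return [row[n:] for row in aug]


def ols(x_rows, y):
    """Ordinary least squares: beta = (X'X)^-1 X'y.

    `x_rows` holds the predictor rows (year, mileage, seller type in the
    project); an intercept column of 1s is prepended here."""
    x = [[1.0] + list(row) for row in x_rows]
    xt = transpose(x)
    xtx_inv = inverse(matmul(xt, x))
    xty = matmul(xt, [[v] for v in y])
    return [row[0] for row in matmul(xtx_inv, xty)]
```

A listing's predicted price minus its asking price then gives the "undervalued by how much" figure the write-up is after.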

Utilities.
The utilities are functions that I made myself to ease several computing processes used throughout my code.

Detailed Implementation
Upon opening the KSL Car Regression Tool workbook, the user will see the welcome page with brief instructions for using the tool. This welcome page also warns users not to modify information in the hidden tabs. Fortunately, the user should never have to know what is in the hidden tabs.

Figure 1: Welcome page and Ribbon Modification

Ribbon Customization
This button allows the user to run the program without accessing the code. This is a good security measure, ensuring that a user does not accidentally change any code and leave the program no longer working properly.
Upon clicking the Start Regression button, the user form will appear.

User Form
The user uses the form to select the car specifications for the search and the subsequent regression. Since only models need to be entered, the makes are there simply for the user’s convenience; KSL will work fine if you never enter the make of each model.

Figure 2: Select Make

Upon selecting a make from the first list box, the model list box will instantly populate with the corresponding models for the selected make. The user can then select a model and click Add to move it to the list box of selected models used in the subsequent steps of the program.

Figure 3: Select Model and click Add
If you decide to remove models that you had previously selected, you can select the model you wish to remove and click Remove. Removing all the selected models can be accomplished by clicking Remove All. The makes and models information originated as a text file in a database language, which had to be parsed in order to extract the makes and models.
The other text boxes work as one would expect. You can enter any text you want, but the data will be checked for suitability before it is used by the web control. The Delete Previous option enables the user to start fresh and clears all worksheets containing information from prior regressions. Alternatively, the user may update previous listings by reselecting the model, which causes any additional listings to be added to the same sheet for that model.

After makes and models are selected and the other information is entered, the user can click OK to begin, at which point the data starts to be processed. Each piece of data is checked: the min price must be less than the max price, and the min year must be less than the max year and less than the current year plus 1.

One interesting thing is that KSL only has check boxes for every 10,000 miles, and for price at almost every $1,000. The mileage and price data needed to be rounded in order to agree with the check boxes in KSL Cars Classifieds.
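This rounding step is a one-liner worth showing. The project does it in VBA; as an illustrative Python sketch, with the 10,000-mile and $1,000 increments taken from the paragraph above:

```python
def round_to_increment(value, increment):
    """Round a user-entered value to the nearest site increment so it
    matches one of KSL's check boxes (10,000 for miles, 1,000 for
    price, per the write-up)."""
    return int(round(value / increment) * increment)
```

So a user-entered maximum of 24,300 miles becomes the 20,000-mile check box, and a price of $13,499 snaps to $13,000.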

KSL uses numbers for all of its button names. The button names for zip codes and makes do not correspond to their visible values, but for the other information they do. To speed up processing time, these codes were copied to a text file, parsed, and then stored in the workbook. The user form creates arrays with the zip and model button numbers for use by the web control. The hidden button numbers can be seen on the hidden sheet “Make_Model_Lemon”.

The user form then processes the information received from the distance query, creates new sheets for each model not previously searched, and calls the web query and regression subroutines.

Distance Query
Notice the zip code and distance text boxes. Nothing with similar functionality is found on the KSL Cars classifieds website. These boxes allow the user to specify a distance from a central zip code within which they are willing to travel to look at a car they may be interested in buying. After clicking OK, the zip code is matched against a list of zip codes for all six states in which KSL operates, on the “Make_Model_Lemon” worksheet, to find the corresponding county. The county is then found on the “DistMatrix” worksheet. The row on which it is found contains the distances between that county and all other counties, 224 counties in total. These distances are compared to the distance entered in the text box; if a distance in the row is less than the distance specified by the user, the corresponding county is selected and placed in an array. All selected counties are then used back on the “Make_Model_Lemon” worksheet to obtain all zip codes within those counties, for individual entry as check boxes on the KSL website.
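The two-worksheet lookup described above (zip to county, then county to distances, then back to zips) can be sketched in Python. The data structures here, a zip-to-county dict standing in for “Make_Model_Lemon” and a pairwise-distance dict standing in for “DistMatrix”, are illustrative stand-ins for the hidden sheets:

```python
def zips_within_distance(home_zip, max_distance, zip_to_county,
                         county_distances):
    """Return every zip code whose county lies within `max_distance`
    miles of the home zip's county.

    `zip_to_county` maps zip -> county (the Make_Model_Lemon data) and
    `county_distances` maps (county_a, county_b) -> miles (one row of
    the DistMatrix data). Both are illustrative stand-ins for the
    project's hidden worksheets."""
    home_county = zip_to_county[home_zip]
    nearby = {
        county for county in set(zip_to_county.values())
        if county == home_county
        or county_distances.get((home_county, county), float("inf"))
        <= max_distance
    }
    return sorted(z for z, c in zip_to_county.items() if c in nearby)
```

Each zip code in the returned list is then ticked as a check box on the KSL search form by the web control.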

Figure 4: The Table that holds the distance information

Web Control
The web control uses the processed data from the user form, along with the agent class and its methods, to drive the browser, finding key elements of the KSL website’s HTML and then executing them. The web control required substantial error handling because, even with the rounded price and mileage amounts, not every mileage amount exists as a check box. Without error checking, the web control would hang if it tried to access a check box that does not exist. For this reason, the max mileage is increased in increments of 10,000 until a match is found, and the min mileage is decreased by 10,000 until a match is found. The price button numbers are handled similarly, but in increments of $1,000.

Each set of buttons is cleared when the web control loads a new model, because if KSL runs again in the same instance of Internet Explorer it will remember the values from the previous run. The commands for selecting the model are strategically placed last so that the other specifications do not need to be included in the loop, since all of the specifications apply to all of the models selected. After all information is loaded, the Load Results button is clicked.

Figure 5: Illustration of the Web Control submitting data

The web control then parses the page to find the number of results returned. If it finds 0, the next model immediately begins loading.

If results are returned, the web control continues to parse through the HTML until it finds the link to the page containing all of the information for each listing.

This link is then passed to the Web Query wizard for extraction.

Data Extraction
The data extraction uses the link from the web control to copy the information from the specific listing page and dump it into a worksheet called “dropsheet”. This is far faster (and easier) than using the web control for the same job.

Figure 6: The Drop Sheet

Data Collection
With the information in the drop sheet, data collection can pull the information from the sheet and organize it in a sheet bearing the model’s name. This part includes utilities to remove unwanted characters from the data that are pulled.

Data Checking
After all listings for all models are dropped, collected, and sorted, the data checking begins, running once for each sheet containing listing data.
There are three parts to data checking.

1. The first part checks for duplicate listings and removes them. Since KSL’s default list order is newest to oldest, many people relist the same car frequently. These duplicates have the potential to skew the results of a regression, so they must be removed first.
2. The second part checks the data used in the regression for completeness. If any number is missing, the listing is either deleted or the value is set to 0; it is set to 0 only when the missing value is the mileage and the car is a new car sold by a dealership.
3. The third part checks each car’s description for any information that identifies the car as unsuitable as an investment opportunity. It does this by checking a list of words used to describe the poorest-quality cars. If any of those words are found, the listing is deleted from the sheet. The list of words comes from the hidden sheet “Make_Model_Lemon”.
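The duplicate-removal step could be sketched with a Scripting.Dictionary keyed on a listing fingerprint (the columns used for the key here are an assumption; the real identifying fields may differ):

```vba
' Sketch of duplicate removal; key columns are an assumption.
' Walks bottom-up so the topmost (newest) copy of each listing is kept.
Sub RemoveDuplicateListings(ws As Worksheet)
    Dim seen As Object, key As String, r As Long
    Set seen = CreateObject("Scripting.Dictionary")
    For r = ws.Cells(ws.Rows.Count, 1).End(xlUp).Row To 2 Step -1
        key = ws.Cells(r, 1).Value & "|" & ws.Cells(r, 2).Value & _
              "|" & ws.Cells(r, 3).Value
        If seen.Exists(key) Then
            ws.Rows(r).Delete            ' older relist of a car already seen
        Else
            seen.Add key, True
        End If
    Next r
End Sub
```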

Matrix Loading
After the data is checked, it must be loaded into variants, which can act like matrices. The prices are loaded into a variant with n×1 dimensions, and the other factors are entered into another variant with n×number_of_factors dimensions. Since matrices contain numbers, the seller type variable is entered as a binary value: 1 for “For Sale by Owner” and 0 for “Dealership”.

An ordinary least squares regression is then performed. The calculation involves the matrix operations transpose, multiplication, and inverse. The equations are as follows:

β = (XᵀX)⁻¹XᵀY
Ŷ = Xβ

where β is the coefficient vector and Ŷ is the expected price given a set of factors.
The coefficients are then reported to the results page. Ŷ could be used for further analysis but is outside the scope of this program.
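For comparison, the β calculation can be sketched with Excel’s built-in matrix functions (the project instead used its own variant-compatible matrix routines and an SVD-based inverse, as described below):

```vba
' Minimal OLS sketch using Excel's built-in matrix functions.
' X is an n x k factor matrix, Y an n x 1 price vector (1-based variants).
Function OLSBeta(X As Variant, Y As Variant) As Variant
    Dim Xt As Variant, XtX As Variant, XtY As Variant
    With Application.WorksheetFunction
        Xt = .Transpose(X)
        XtX = .MMult(Xt, X)                      ' X'X
        XtY = .MMult(Xt, Y)                      ' X'Y
        OLSBeta = .MMult(.MInverse(XtX), XtY)    ' beta = (X'X)^-1 X'Y
    End With
End Function
```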

Figure 7: The Results Page

Matrix Operations
The variants that contain the data are unsuitable for the matrix tools that come with Excel (the price matrix is an n×1 matrix), so it was necessary to recreate these functions in versions that can handle the variants. This includes matrix transpose, matrix inverse, and two functions for matrix multiplication: one for the n×1 case and one for the n×n case. The matrix inverse operation uses the SVD class, relying on singular value decomposition instead of Gauss-Jordan elimination to obtain the inverse.
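A variant-based matrix multiply of the kind described might look like this sketch (assuming 1-based, 2-dimensional variant arrays):

```vba
' Sketch of a matrix multiply over 1-based 2-D variant arrays,
' of the kind recreated because Excel's tools could not handle the variants.
Function MatMult(A As Variant, B As Variant) As Variant
    Dim i As Long, j As Long, k As Long, s As Double
    Dim C() As Double
    ReDim C(1 To UBound(A, 1), 1 To UBound(B, 2))
    For i = 1 To UBound(A, 1)
        For j = 1 To UBound(B, 2)
            s = 0
            For k = 1 To UBound(A, 2)
                s = s + A(i, k) * B(k, j)    ' dot product of row i and column j
            Next k
            C(i, j) = s
        Next j
    Next i
    MatMult = C
End Function
```

The transpose and the n×1 special case follow the same looping pattern.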

Utilities
These include several helpful functions and subs that were employed numerous times throughout the project: functions for removing certain characters from a string, array operations like array complements, a function to remove empty entries from arrays, a function to tell whether an array is empty, and so on.
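The character-removal utility, for example, might be sketched as follows (the name and signature are assumptions):

```vba
' Sketch of a character-stripping utility like the one described;
' returns text with every character found in badChars removed.
Function RemoveChars(ByVal text As String, ByVal badChars As String) As String
    Dim i As Long, ch As String
    For i = 1 To Len(text)
        ch = Mid$(text, i, 1)
        If InStr(badChars, ch) = 0 Then RemoveChars = RemoveChars & ch
    Next i
End Function

' Example: RemoveChars("$12,500", "$,") returns "12500"
```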

I learned that VBA programming can be difficult and time consuming, but it is also rewarding to see the creation of a new, useful tool. I wish I had kept track of the hours, but I think I have spent around 120 hours creating this program. I feel accomplished in that I can use this program to find a car with high value, and I also have a great project to show potential employers.

I felt that my project used most of the major concepts covered in class. I used arrays and variants extensively. I created a user form that reacted to the “change” event in the case of the Makes list box. I used error checking extensively, as well as both tools for interacting with the internet. I created my own functions. I learned to parse through three text files to get the information needed to create the program. I created and ran the distance query for almost three days to obtain all of the distance information (25,088 iterations). I used many nested loops and conditional statements.

There were even minor concepts I picked up that were not covered in class, such as the fact that arrays cannot be passed as optional arguments; you have to use variants. I also learned about converting and working with dates, which I used when getting the time each ad was posted. I had to use the DateSerial and DateValue functions to get an accurate time reading for recent updates.

I learned that it is far easier to use the web query wizard than the agent when getting information from the internet; the agent is best used to submit information over the internet. I also learned several debugging techniques, such as the lines:

If i = 67 Then
    i = 67
End If

This places a breakpoint in the middle of a loop when i = 67: set a breakpoint on the inner line, then step through the code to check for errors with the Locals window open.

I learned that it is very important to break my code into small, manageable chunks so that different components can be called at different times. If there is one thing I would change, I would try to do that even more. I also learned to create testing subs to check my functions and other subs.

I would like to add even more features to this program, such as email and text message capabilities. I would also like to schedule reruns every so often so that my program can find cars with great value while I’m at work. Finally, I would like to do more with the regression: study the residuals, test whether the regression meets the required assumptions, and maybe automate a transformation of variables.

I eventually resolved all of my difficulties, but some of them had me stumped for a long time. For example, while using the agent, sometimes a string literal would execute just fine, but a variable version would not. After talking to Dr. Allen, we discovered that converting the variable to an Integer or Long and then right back to a String worked.

I had a lot of difficulties with the agent. I eventually realized that the a.explorer.goback method caused Internet Explorer to go back, but the HTML text I was reading would not update when using that command. For that reason it became necessary to get the URL while on the previous page and then follow the link by text, in order to move back and have the HTML text update.

I received some assistance from Dr. Allen, especially with using the agent. I also consulted the internet quite a bit for specific problems involving functions and syntax I was not aware of. I managed to find the SVD class online, which allowed me to perform my own version of matrix inverse much more quickly than by doing the Gauss-Jordan elimination I learned in linear algebra class. Excel does have a matrix inverse function called MINVERSE that works on multidimensional arrays, but I needed it to work for variants, so I used the code from Vanna. I also, of course, did not create the agent class.

Executive Summary
Car Regression

Rental Property Analysis

While I didn’t perform work for any official business, the work addressed a business need for my father and me. About seven years ago we began investing in rental properties at a rate of approximately one per year. We currently have seven rental properties, and we are continuously looking for the “right deal”. Because this is a side business, we do not have a significant amount of time to devote to rental property analysis. Moreover, my father, a retired CPA, and I, an aspiring business consultant, have slightly different views about which parts of a rental property are important to analyze. As such, we both run various analyses, typically in Excel, with different results based on a variety of factors. To combine our analyses, numerous spreadsheets need to be created. The task can be exhausting.

Therefore, to combat this exhausting task, I decided to create an easy-to-use rental property analysis spreadsheet. The user interface can be easily modified for each property analyzed, and it includes a number of different evaluators, from NOI calculations to sensitivity analysis based on various yearly increases in rent.

The project has one centralized spreadsheet where all calculations are combined to give an overview of each rental property considered. Following the overview, there is a tab dedicated to each of the following: sensitivity analysis, purchase price information, financing assumptions, rental income assumptions, predicted yearly expenses, and finally a tab that will bring in current rates from

Based on user-entered information, the spreadsheet provides the return on invested capital over a 10-year period for each rental property. As a general rule, my father and I try not to invest in a property with a predicted ROI lower than 12% in the first year and a minimum of 15% within the first four years. The ROI calculation is based on revenues, expenses, and acquisition information, all provided by the user through simple-to-use user forms. These user forms can be accessed from the ribbon under the Real Estate tab, or individually on each of the dedicated sheets.
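The first-year screen described above amounts to a simple ratio; a minimal sketch (the function name and inputs are assumptions, and the real workbook draws these figures from the dedicated tabs):

```vba
' Sketch of the first-year ROI screen; names and inputs are assumptions.
Function FirstYearROI(rentalIncome As Double, expenses As Double, _
                      investedCapital As Double) As Double
    ' Net operating cash relative to the capital actually invested
    FirstYearROI = (rentalIncome - expenses) / investedCapital
End Function

' A property passes the first screen when FirstYearROI(...) >= 0.12
```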

Student Database

Executive Summary

My friend is a middle school math teacher. The school district gives out standard tests so it can track the status of the various math classes. However, this data is not well utilized, and the teachers get no use from it. These teachers must take time out of their day to give students this test, and they would appreciate some tangible benefit from it. I designed a program that allows teachers to utilize this database of information.

My program allows teachers to select a portion of the database, sorted by teacher and/or class, and then see statistics for the chosen sample. This allows them to evaluate how a particular class, or a particular teacher, is doing. It gives statistics for the chosen sample, such as average and standard deviation, and for the database as a whole to allow comparison. It also breaks down statistics by gender, so teachers can see any potential correlations. A teacher can access her class and see how it compares to the school as a whole, which helps her know whether she needs to improve or is exceeding the school average. The project also creates a histogram, since numbers are more valuable when seen visually.
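The class-versus-school comparison could be sketched as follows (the “Scores” sheet layout, with teacher names in column A and scores in column C, is an assumption):

```vba
' Sketch of the class statistics lookup; sheet layout is an assumption.
Sub ClassStats(teacher As String)
    Dim ws As Worksheet, r As Long, lastRow As Long
    Dim scores() As Double, n As Long
    Set ws = ThisWorkbook.Worksheets("Scores")
    lastRow = ws.Cells(ws.Rows.Count, 1).End(xlUp).Row
    ReDim scores(1 To lastRow)
    For r = 2 To lastRow
        If ws.Cells(r, 1).Value = teacher Then   ' rows for this teacher only
            n = n + 1
            scores(n) = ws.Cells(r, 3).Value
        End If
    Next r
    If n = 0 Then Exit Sub
    ReDim Preserve scores(1 To n)
    With Application.WorksheetFunction
        MsgBox teacher & ": average " & Format(.Average(scores), "0.0") & _
               ", std dev " & Format(.StDev(scores), "0.0")
    End With
End Sub
```

The school-wide figures would come from the same columns without the teacher filter.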

In addition to viewing aggregate statistics, the teacher can pull out specific students who are struggling and those who are above average. This information allows teachers to tailor their classes to specific students, giving students more individualized teaching. It could also be useful for identifying students who might be better placed in a higher- or lower-level math class, or who should at least receive additional tutoring. Another function allows a teacher to pull up a particular student to see how he or she is doing compared to classmates. This is a nice starting point for determining how a student is doing. For example, if a student were struggling, it would be useful to know whether that student’s poor scores are unique to the student or whether the student ranks in the middle of the class. This helps a teacher understand whether the problem lies with the particular student or with an outside factor affecting the whole class.

In addition to these evaluation functions, the program also allows basic maintenance. There are user forms for searching the database, editing entries, adding new students, and entering test scores for existing students. These features keep the database current.
