Probably the most time-consuming and sometimes frustrating aspect of working with data from different databases is figuring out how to match it into a single usable data set. It is at this point you want to strangle end users who don't take the time to enter data correctly, because crap data in means crap data out. We usually shoot for matching around 95% of the correlating data, then either have end users correct the input or create exception rules to correct for repeatable patterns. Most clients believe we spend most of our time figuring out their database design, when in actuality that is a much simpler process than separating the good data from the crap.
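The matching-plus-exception-rules approach can be sketched with nothing more than Python's standard library. This is only an illustration, not our actual tooling: the normalization rules and the 0.85 cutoff are hypothetical stand-ins for the repeatable patterns and thresholds you would tune against your own data.

```python
# Sketch: fuzzy-match hand-entered names across two databases, with a
# normalization pass acting as the "exception rules" for repeat offenders.
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    """Exception rules for patterns that show up repeatedly in entered data."""
    name = name.strip().lower()
    name = name.replace("  ", " ")                         # double spaces
    name = name.removesuffix(" jr").removesuffix(" jr.")   # inconsistent suffixes
    return name

def best_match(target: str, candidates: list[str], cutoff: float = 0.85):
    """Return the closest candidate above the cutoff, or None for manual review."""
    target = normalize(target)
    scored = [(SequenceMatcher(None, target, normalize(c)).ratio(), c)
              for c in candidates]
    score, match = max(scored)
    return match if score >= cutoff else None

db_a = ["Smith, John Jr.", "Doe,  Jane"]
db_b = ["smith, john", "doe, jane", "roe, richard"]

for rec in db_a:
    print(rec, "->", best_match(rec, db_b))
```

Anything that falls below the cutoff comes back as `None`, which is the bucket you hand back to end users for correction or study for the next exception rule.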
Monday, October 22, 2012
Saturday, October 20, 2012
New Project with Cameron
At the beginning of my son's school year (second grade) I made a deal with him that if he was doing well enough in math and reading (as well as behaving at school), I would begin to teach him how to write webpages and web-based applications. He has always been inquisitive when it comes to websites and the work Dad does on a daily basis. I figured what a great way to get him to do something educational while having fun and making sure he is on top of the technology that will surely form his future.
Well, this year started off a little questionably with regard to sitting still and being quiet in class, but after the first couple of weeks, he has exceeded my expectations in the goals that we set for him. So tonight was his first lesson in website design, and hopefully we will be able to spend at least a half hour each day on concepts and simple coding. Our first lesson tonight was what makes up the internet. The lesson went a little like Sheldon teaching Penny the basics of physics on The Big Bang Theory. In the beginning...maybe not that far back...but close enough. I tried to keep the concepts simple, with terms a seven-year-old can understand and enough analogies to the real world to help him along. It was crazy on the whiteboard tonight.
We discussed the concepts of the cloud, host computers and client computers. I still don't think many adults in the professional world grasp this concept but Cameron and I will go over it many more times in the future and get more detailed on it. We also covered the engine (server app), fuel (the code), and output of power (the web page) in basic format. We covered a little bit of the coding languages available to us and the ones we would focus on to begin with.
We spent the majority of our time discussing what HTML does and how it is the basis for formatting a page. I gave him some real-world examples of the basics such as underlining, bold (not the easiest concept to describe in words, much easier on the whiteboard), fonts (will definitely need to focus on this more), color, and tables.
I thought that was going to be the end of the first day of discussion, until I gave him some homework to complete for tomorrow. I started by having him draw, on paper, a simple webpage (without any real functionality besides layout and colors) to serve as a home page for his project. I also asked him to write out what his first real application will be (this will give us our end goal for this school year; I have to keep reminding him to be patient and that it won't be complete for quite a while). During the first portion of the homework, he started asking some questions, and I noticed him trying to make it perfect the first go-around. This led into a second lesson on scribbling and brainstorming, and the concept of rough drafts versus final drafts.
The most rewarding part of this project is seeing Cameron's excitement about taking it on. I am hoping this fuels even more interest in reading, math, and creativity, and that he starts seeing applications of what he is learning in school. Time for me to find some sites that teach basic HTML to younger kids (any ideas out there?) so he has something to do if I am stuck working. I also have to start putting together a binder of lesson plans, his work, and the screenshots from the whiteboard for him to look back on. I am not trying to make the next Gates or Zuckerberg, but as long as he has an interest, I will definitely provide the direction.
Any ideas out there for teaching younger kids about website design, or real-world examples that map to website design like the car engine model I used previously? If anybody is interested, I can definitely post some of the materials and screenshots we are creating, and we will post a link to his development site as soon as we get to that point.
Monday, October 15, 2012
Data Mining
We have been spending quite a bit of time over the past year extracting sets of data from disparate databases, which in itself has been an interesting experience. But we have finally been tasked with helping to develop a real trending and forecasting tool to help predict future patient outcomes in a specific population. This is where the gathering of the data becomes fun and fulfilling. Finding patterns, matching data sets, and interpreting lab results over long periods of time are what make being in the information technology field an interesting challenge. Within the first week of spitting out initial results, we are seeing promising scenarios, but we know much more fine-tuning will need to be done.
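The simplest form of the lab-result trending above is a least-squares line fit over a patient's readings. This is a toy sketch, not the actual tool: the readings, the time axis, and the 0.3-per-year review threshold are all made-up numbers for illustration.

```python
# Fit an ordinary-least-squares line to lab values over time and flag a trend.
def linear_trend(days, values):
    """Return (slope, intercept) of the least-squares fit of values vs. days."""
    n = len(days)
    mx = sum(days) / n
    my = sum(values) / n
    sxx = sum((x - mx) ** 2 for x in days)
    sxy = sum((x - mx) * (y - my) for x, y in zip(days, values))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical HbA1c-style readings taken over roughly two years
days = [0, 90, 180, 365, 540, 730]
vals = [5.6, 5.8, 5.9, 6.1, 6.4, 6.6]
slope, intercept = linear_trend(days, vals)
print(f"rise per year: {slope * 365:.2f}")
if slope * 365 > 0.3:          # made-up review threshold
    print("flag for review")
```

In practice the fine-tuning is exactly here: choosing which measures to trend, over what windows, and where the flag thresholds sit for the population in question.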
Tuesday, May 24, 2011
Migrating Customer Applications to the Cloud
Recently, I acquired a new medium-sized customer with no on-site IT support. Previously they outsourced their IT services to a very capable group, but as I started working with this company, it became evident that management wanted to take a serious step towards minimizing internal services and IT management. In the past, I would have been hesitant to undertake this operation, as I was not fully comfortable relying on a cloud company to host a business in its entirety. But the organization was completely sold on moving their services, and I wanted to make sure they had every need taken care of, so we signed an agreement to get them moving forward.
I did my research on all of the hosted services available, looking specifically at services for providing email and file sharing. There are many solutions for email, and most of them cost about the same, so after looking at the organization's needs we were able to narrow the selection quickly. Hosting file services would be a little trickier. Do the end users need access to files simultaneously? Will they be using a product like QuickBooks, where multiple users may be entering data at the same time? How large are the individual files the organization will be hosting? What kind of backup service is offered with the package? What kind of security compliance (HIPAA/Sarbanes-Oxley/FISMA) would they need?
After sorting through the organizational needs, I decided on a solution and went through the process of creating the appropriate data shares, securing them to the company's needs (determining this was an undertaking on its own), and synchronizing data between the local server and the cloud. If you go through this process, I highly recommend buying a copy of GoodSync so that both versions of the data stay updated while you migrate users over to the new service. One of the features that made our choice a little easier was the ability to map network drives to the cloud, making the transition easier for the end users. It definitely lowers the learning curve, and they will be able to take their time learning the other advanced features without affecting their production.
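For anyone curious what the migration-period sync actually does, here is the core rule sketched in Python. GoodSync itself handles this bidirectionally with conflict detection; this one-way, newest-copy-wins version with made-up paths is only to show the idea of keeping the local share and the cloud-mapped drive in step.

```python
# One-way mirror: copy files from the local share to the cloud-mapped drive
# whenever they are missing or older at the destination.
import shutil
from pathlib import Path

def mirror_newer(src: Path, dst: Path) -> list[str]:
    """Copy files from src to dst when missing or older there; return what moved."""
    copied = []
    for f in src.rglob("*"):
        if not f.is_file():
            continue
        target = dst / f.relative_to(src)
        if not target.exists() or f.stat().st_mtime > target.stat().st_mtime:
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)            # copy2 preserves timestamps
            copied.append(str(f.relative_to(src)))
    return copied
```

Run it on a schedule in both directions during the cutover window and users can work against either copy until the local server is retired.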
The end goal is to remove their internal servers, GoToMyPc applications, and the need for VPN services. The migration to outside services will be saving this organization almost 90% of their monthly cost for IT services/hardware. Current projections for this project put it in a two week period to accomplish these feats.
Soon on the horizon: getting into some of the data extraction/manipulation projects that FAST has been working on.
Friday, October 2, 2009
A First Look at Azure in the Cloud
I spent some time yesterday at a Microsoft presentation for their soon-to-be-released Azure cloud services, and I came away with mixed feelings. First, the concept of cloud computing is a promising foundation for removing hardware acquisition costs for an enterprise organization. The ability to run a company's applications on virtually unlimited hardware resources will definitely benefit organizations whose resource needs spike rather than stay constant. On the flip side, I do not see much advantage for applications with steady, predictable utilization. With the advent of in-house virtualization, most smaller applications can achieve the same dynamic resource boosting seen in a cloud computing environment without the recurring costs.
Another big disadvantage to adopting this technology is the inability to meet regulatory standards; i.e., data hosted in a cloud environment cannot currently be proven compliant with HIPAA guidelines. Many markets will not be able to adopt the technology for this reason alone; both the medical and financial industries will find it hard to comply with any regulatory guidelines. It will work well for online commerce sites, but with increasing governmental involvement, how long before regulations limit the usefulness of holding data in the cloud?
As for Microsoft's foray into the cloud, I came away both skeptical and intrigued by what they have put together so far. As with a lot of Microsoft products, the offering seems to be in its infancy compared to competitors. They are promising a November release to the public, and I highly doubt that much-needed technology will be included by that time. Foremost, the ability to manage the web and worker instances is nowhere near complete. The interface for managing instance counts works but is very simplistic and could use some polish. The bigger disadvantage is the lack of a utility to show current performance versus the performance gained by adding instances. Right now, it is purely a guessing game how many instances will handle an application's traffic and processing requests. Microsoft definitely needs to develop a workload simulator in the staging environment that provides real feedback for determining instance usage.
Microsoft also needs to work on the SLA portion to allow users to automatically increase/decrease the number of instances based upon acceptable performance standards. How many of us want to wake up in the middle of the night because our application is being over-utilized just to update an XML file to deploy more instances for a two hour period? It would make more sense for the fabric controller to sense the utilization levels and increase to meet the demand up to a specified point, either a price point or utilization point.
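The scale-to-demand-up-to-a-price-point behavior described above can be sketched as a simple control rule. To be clear, this is not how Azure's fabric controller works today (that is the complaint); the thresholds, pricing model, and function are all hypothetical.

```python
# Sketch of an autoscaling rule: converge the instance count toward demand,
# capped by what the hourly budget will buy. All numbers are illustrative.
def desired_instances(current: int, utilization: float,
                      price_per_instance: float, budget: float,
                      high: float = 0.80, low: float = 0.30) -> int:
    """Return the instance count the controller should converge to."""
    max_affordable = int(budget // price_per_instance)
    if utilization > high:                     # over-utilized: add capacity
        target = current + 1
    elif utilization < low and current > 1:    # idle: shed capacity
        target = current - 1
    else:
        target = current
    return max(1, min(target, max_affordable))

print(desired_instances(4, 0.92, price_per_instance=0.12, budget=1.00))  # 5
print(desired_instances(4, 0.92, price_per_instance=0.12, budget=0.50))  # 4: budget cap
```

A rule like this, evaluated by the platform every few minutes, is exactly what would replace the midnight XML edit.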
That all being said, I look forward to learning more about the other cloud foundations at the Day of Cloud conference in two weeks. Hopefully I will be able to provide more feedback on the Amazon, Google, and SalesForce.com initiatives then.
Labels: amazon, azure, cloud computing, google, salesforce
Sunday, August 16, 2009
Blogging and Marketing for the IT Consultant
I have been trying to find more ways to market my IT consulting business without spending a lot of capital. We have been very successful with the word-of-mouth and quality-of-service approach for the past five years, but I am always looking for new ways to gain footholds in businesses that need quality programming and architectural direction. A colleague mentioned that blogging with social apps as a front-end marketing tool is a highly successful means of gaining a wider audience. I started this blog late last year, and with business being brisk, I find it hard to maintain a daily or weekly blog. So I figured I would ask the community a few questions to determine if this is the best course of action.
First, how long does one commonly spend updating their blog? How often are most executives updating their blogs: daily, weekly, monthly? Are most bloggers with a marketing angle touting their wares and skills, or are they giving input on general IT topics relevant at the time? And, I think most importantly, what kind of return in new opportunities are bloggers seeing in the consulting field?