Friday, August 18, 2017

Build Automation using PowerShell



Now that we know why build automation is needed, in this detailed post I will cover how to approach build automation using PowerShell. Since this is a technical post, I have included a list of acronyms and some useful links towards the end for the reader's benefit.

Background & Requirement

We use Microsoft Visual Studio to develop our .NET projects. For a modestly sized client project with both web services and Windows services, we initially relied on a manual approach to create daily builds and deploy them on QA, UAT and production servers. On average, this was taking around 2 hours of a DevOps team member's time, and more if issues cropped up. QA was getting delayed every morning waiting for the link to the QA server with the new build. We needed to automate this workflow as much as possible.
I was asked to undertake the task. Being a newbie to this, I did some research and found that doing Continuous Integration (CI) means choosing among various tools and tool sets. Tools that I could use were TeamCity, Jenkins, Team Foundation Server, Bamboo, CircleCI and so on. Most were excellent tools supporting all stages of CI, but they also cost money in terms of licenses, either per user or per seat. Besides cost, there were also learning curves and their own limitations.
I have worked with PowerShell since its alpha release in 2005 and have grown fond of its power and versatility over time. I use it for most of my automation needs, be they personal or work related. A relatively new Windows command shell, it is built on .NET and integrates closely with C#, which makes it much more powerful than the traditional Windows CMD shell. Much loved and rarely rebuked, it has a very vibrant user community in the system management and automation space; just do a web search for common system management tasks through PowerShell. Through PowerShell Core, it is also making its way into the Linux world. PowerShell ships with many built-in cmdlets for most day-to-day work, and many Windows components & other vendors provide their own cmdlets for managing their environments from the PowerShell command line.
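To give a flavour of these built-in cmdlets, here are two everyday system-management one-liners (illustrative only):

  Get-Service | Where-Object Status -eq 'Running'    # list all running Windows services
  Get-EventLog -LogName System -Newest 10            # show the ten most recent System event log entries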

Justification & Business Case

Faced with the licensing costs & other constraints of traditional CI tools, and given my own comfort with PowerShell, I decided to script the whole build automation workflow using PowerShell. It was a wonderful decision. The automation has been working very well for 2 years now, saving a minimum of 2 hours per day on a small project with 5 web services, 3 Windows services and code stored in an SVN repository. Later on, we extended it to work with a much bigger project with 11 Visual Studio (VS) solutions and 71 VS projects, with code stored in TFS (Team Foundation Server). It saved 3 hours per day on the second project. We could deploy this on any Windows machine where Windows Management Framework (WMF) 4.0 could be installed. Everyone on the development team was aware of the code build status, and DevOps could use the output of this process to deploy it inside the client VPN.

Creating a Build Management Framework

For our build management workflow, I used the tools, cmdlets and techniques below; a condensed sketch of the whole workflow follows the list.
  1. Retrieve code from source control using the SVN command line and TFS PowerShell cmdlets.
  2. Modify solution files and project files, which are plain XML and easily manipulated from PowerShell.
  3. Use NuGet.exe to download on-demand packages.
  4. Use MSBuild and MSDeploy to build and deploy to the desired location.
  5. Use IIS PowerShell cmdlets to manage web services and deploy them.
  6. Use out-of-the-box PowerShell cmdlets to manage Windows services.
  7. Validate deployments and do BVT using Invoke-WebRequest, Invoke-RestMethod and other out-of-the-box cmdlets.
  8. Invoke our test suites written in TestNG or NUnit directly.
  9. Use Send-MailMessage to send mails with the results and attachments. Securely save the sender password on the system through PowerShell's secure-string capabilities.
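Below is a condensed sketch of this workflow in PowerShell. The step numbers in the comments refer to the list above; all paths, URLs, server names and credentials are hypothetical placeholders, and the MSBuild path will vary with the installed version, so treat this as a sketch rather than our production script.

  # Step 1: fetch the latest code from SVN (svn.exe is assumed to be on PATH)
  $src = 'C:\Build\src'
  & svn checkout 'https://svn.example.com/repo/trunk' $src

  # Step 4: rebuild the solution with MSBuild
  $msbuild = "${env:ProgramFiles(x86)}\MSBuild\14.0\Bin\MSBuild.exe"
  & $msbuild "$src\MyProduct.sln" /t:Rebuild /p:Configuration=Release
  if ($LASTEXITCODE -ne 0) { throw "Build failed with exit code $LASTEXITCODE" }

  # Step 6: restart a deployed Windows service with out-of-the-box cmdlets
  Restart-Service -Name 'MyWindowsService'

  # Step 7: BVT - verify that a deployed web service responds
  $response = Invoke-WebRequest -Uri 'http://qaserver/MyService/status' -UseBasicParsing
  if ($response.StatusCode -ne 200) { throw 'BVT failed: web service is not healthy' }

  # Step 9: mail the results; the sender password was saved earlier (encrypted per user) with
  #   Read-Host -AsSecureString | ConvertFrom-SecureString | Set-Content C:\Build\mail.pwd
  $securePwd = Get-Content 'C:\Build\mail.pwd' | ConvertTo-SecureString
  $cred = New-Object System.Management.Automation.PSCredential('build@example.com', $securePwd)
  Send-MailMessage -From 'build@example.com' -To 'team@example.com' -Subject 'Daily build: SUCCESS' -Body 'Build, deployment and BVT passed.' -SmtpServer 'smtp.example.com' -Credential $cred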
If I had been automating builds created in other environments, such as Java with Maven/Ant & Git, I could have easily installed those on my build machine and used the corresponding command lines to get the code and build it.

Major Benefits of the PowerShell-based approach

The major benefits of using PowerShell for build automation, in my view, are as follows:
  1. Easy availability and no licensing cost – Everything mentioned here is available free of cost on any Windows computer. There is no ongoing cost beyond the initial development cost.
  2. Easy deployability – This setup can be deployed anywhere on a Windows computer and start working without much of a hitch.
  3. Easy maintenance – Once used to tweaking the required configuration files, even a novice DevOps engineer can maintain this. If the DevOps team is well versed in PowerShell, they can even debug and update the scripts as and when needed.
  4. Modularity – I divided my solution into 4 parts: SVN fetch, solution modification, build and deploy. Most of the code is easily reusable across various .NET projects.
  5. Longevity – Microsoft is committed to enhancing the PowerShell experience across the board, as are many vendors. Investment in this is not going to be wasted down the line.
  6. Existing knowledge – Most DevOps engineers have some exposure to PowerShell and can utilize that knowledge far more easily than learning a new UI tool.
  7. Full control – PowerShell cmdlets and other command-line tools usually expose more information and allow more fine-tuned control than UI-based tools, which always end up needing some scripting support anyway.
With PowerShell, a DevOps engineer has immense power to tweak things as needed. Why run around and struggle with various evolving tools when everything needed for most build automation tasks is easily available out of the box?
An added advantage of a build automation project undertaken through PowerShell is a major knowledge upgrade for the DevOps team. They will be exposed to many tools & concepts and become more nimble & productive with effective usage of PowerShell in their day-to-day work.
Cmdlets available from other vendors and with new software installations

As I noted earlier, many vendors have extended support for PowerShell through their own cmdlet packages. I will list a few now; this is not an exhaustive list by any standard, but it gives a glimpse of what can be done through PowerShell from a DevOps architect's perspective. If your workflow involves more steps than those mentioned earlier, you may need to use one of the below; a tiny illustration follows the list.
  1. AWS tools for Windows PowerShell – Manage AWS services from the Windows PowerShell scripting environment.
  2. Azure PowerShell
  3. SQL Server PowerShell – SQL Server 2017 supports Windows PowerShell. PowerShell supports more complex logic than Transact-SQL scripts, giving SQL Server administrators the ability to build robust administration scripts.
  4. Oracle Cluster and PowerShell
  5. .NET and Data Access – connecting to an Oracle database from PowerShell.
  6. PowerShell for Docker – still under development as an open-source project, but very promising.
  7. Manage VPN connections through PowerShell – if needed, connect to a VPN before code download or deployment.
  8. Manage Windows Clusters through PowerShell
  9. Microsoft Office PowerShell cmdlets – Automate editing of Office Files.
  10. PowerShell in Jenkins – Use PowerShell scripts in Jenkins.
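As a tiny illustration of such vendor cmdlets, listing your S3 buckets with the AWS Tools for Windows PowerShell takes just a couple of lines; the credential variables here are hypothetical placeholders.

  Import-Module AWSPowerShell
  Get-S3Bucket -AccessKey $accessKey -SecretKey $secretKey -Region us-east-1    # list all S3 buckets in the account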

Links, Acronyms & Further Reading

  1. PowerShell learning from Microsoft Virtual Academy
  2. Continuous integration – A development practice that requires developers to integrate code into a shared repository several times a day. Each check-in is then verified by an automated build, allowing teams to detect problems early.
  3. Windows Management Framework (WMF) – the PowerShell upgrade package
  4. List of Build Automation tools
  5. Some CI tools – TeamCity, Jenkins, Team Foundation Server, Bamboo, CircleCI
  6. SVN – Apache Subversion
  7. TestNG – a testing framework inspired by JUnit and NUnit but introducing new functionality that makes it more powerful and easier to use.
  8. NUnit – a unit-testing framework for all .NET languages
  9. QA – Quality Assurance
  10. UAT – User acceptance testing
  11. Cmdlet – a lightweight command used in the Windows PowerShell environment. The Windows PowerShell runtime invokes cmdlets within the context of automation scripts provided at the command line.

Monday, August 14, 2017

Why use Build Automation in Application Development?


The growing relevance of automation & DevOps has revolutionized the software engineering industry and made a deep impact on the way traditional application development is approached. With all the hoopla around it, one thing is certain: it is here to stay, with a long list of benefits. In this blog, I take a look at why Build Automation is needed in contemporary software development projects, but before that let me quickly cover the basics and answer what exactly it is.
What is Build Automation (BA)?
BA, also sometimes referred to as Continuous Integration (CI), is the process of automating on-demand build creation, which encompasses some or all of the below:
  1. Download code from a central repository – Git, SVN, TFS etc.
  2. Make updates to the code structure if needed
  3. Download required external packages via Maven, NuGet, Ant etc.
  4. Build the code using gcc, javac, MSBuild etc.
  5. Create a build share with binaries and default configuration – JAR, WAR, EXE, XML, INI etc.
  6. Propagate the build output to cloud or network shares
  7. Deploy to web servers & other servers
  8. Configure new or upgraded deployments
  9. Run BVT tests on the deployments
  10. Inform relevant stakeholders
A CI build is usually triggered when a code commit is done or a particular tag is created. A basic BA job is usually triggered at a fixed time; dev teams need to finish their commits by that given time.
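As a minimal sketch of such a fixed-time trigger, the ScheduledTasks cmdlets (available from Windows 8 / Server 2012 onwards) can register the job; the script path, task name and time below are hypothetical placeholders.

  # Register a daily 5 AM task that runs the (hypothetical) build script
  $action  = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -File C:\Build\Invoke-DailyBuild.ps1'
  $trigger = New-ScheduledTaskTrigger -Daily -At 5am
  Register-ScheduledTask -TaskName 'DailyBuild' -Action $action -Trigger $trigger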
Continuous Integration vs. Build Automation
CI's benefit lies in giving every team member responsibility for individual commits. Faults are uncovered fast. It's a complex process even with licensed software or a service, and it needs a well-skilled DevOps team. Despite claims of configuration-only settings, some scripting always needs to be done.
In contrast, basic BA takes longer to uncover faults, but its predictable timeline reduces anxiety for team members. It's easy to implement, leaving only a few manual tasks. It can be developed by anyone with basic scripting knowledge, as I will demonstrate in a later post, and it can be done using the native shell of an OS without any licensed software.
Hesitation about doing Build Automation
Because basic BA may still leave some manual steps, many think that it is not worth it. They aren't helped by the lack of enthusiasm on the part of the DevOps or Dev teams. DevOps teams may think that their jobs are in danger.
Dev teams aren't very enthusiastic about the demands on their time. With every new technology, new ways of building and organizing code come around, and DevOps teams will not know all the nitty-gritty of new build systems: the NuGet.exe system may not be very clear to a DevOps person with a Linux background, and Git brings its own peculiarities in dealing with repositories. The dev lead has to be really serious about helping their DevOps counterparts during automation development.
Why is hesitation not right?
Unlike CI, BA can be done economically and in less time, and it gives the benefits below:
  1. Discipline in team members – Initially, dev teams complain about frequent build breaks, but with the right push from the PM they will inculcate better habits.
  2. DevOps time savings – They need not stay late at night or get up early to finish the daily build.
  3. QA time savings – QA need not wait for the new deployment before starting testing. In the case of build breaks or BVT bugs, the turnaround is faster.
  4. Management visibility – Management can gauge developer productivity by looking at build emails. Frequent build breaks can prompt improvements in the quality of dev teams.
  5. Predictable clean builds – Manual builds are typically incremental builds, which may hide build problems.
  6. Predictable clean deployments – Manual deployments can pick up dependencies on deleted configuration. Automation can do a fast clean installation, uncovering broken settings.
  7. Wide dissemination – More stakeholders can be kept informed by automation.
  8. Knowledge improvement for DevOps
  9. Retention tool for DevOps – When DevOps teams are doing higher-quality work, they will be more inclined to stay on and learn more.
Most of the benefits mentioned apply to any automation project, while some are unique to a BA project. It is imperative that any project with more than 3 developers and more than 3 months of development time should do BA. With time, teams can build reusable components for BA projects.
Stakeholders in a Build Automation Project    
DevOps
The DevOps team must be proficient in the native shell of an OS: PowerShell, Bash etc. They can additionally learn cross-platform scripting languages like Python. They need to be well versed with command-line Git/SVN/TFS and also be aware of basic development methodology.
Development
The DevOps team will never know everything about a dev system. The dev manager must provide relevant help whenever needed. They also need to keep dev team members informed about the requirements of build automation, which may include
  1. Commit only unit-tested code
  2. Commit on time
  3. Commit with relevant messages
  4. Maintain code quality
  5. Maintain configuration file quality
  6. Use relevant naming convention in configuration files and code
  7. Help quickly in case of build breaks
QA
QA teams are the biggest beneficiaries of this exercise, so they need to be pushy upfront to get BA going. They should help DevOps with a list of BVT tests to validate test deployments and provide reliable automation for them.
Project Management
The BA project must be driven by the PM team. Being aware of the challenges and modalities involved is an essential part of their repertoire.
What’s next?
In subsequent blog posts, I will cover different aspects of build automation and tools using different use cases. Be on the lookout for them, and all the best for your automation projects. Do reach out to us in case you need any help in your process, and we are sure to be a force multiplier for your requirements.

Sunday, August 6, 2017

5 Key Questions to Ask When Evaluating BI Tools



Geoffrey Moore, an American organizational theorist and management consultant, rightly said: "Without big data analytics, companies are blind and deaf, wandering out onto the web like deer on a freeway." While the market has a growing variety of BI tool vendors due to technological advancement, every tool is distinct in its own way. The overall picture may look comprehensive and a flashy demo may convince you that a tool is the right one, but to pick wisely, businesses have to remember that the subtle differences between the tools are the main markers to consider carefully.

To evaluate these differences, one has to first understand the business requirement and question the intricate aspects to get better insight into a tool's hidden limitations. Any tool you pick should primarily be able to organize the surplus data and aptly generate business-charted analysis or reports. There is no one model that can be used like a "super brain" to comprehend the business requirements and generate data of its own will.
Some of the basic questions to address before zeroing in on a tool would be:

1)    Is the tool shallow in collating data?

Allow the business requirements to predetermine the tool's function, because a tool is an aid to the business and not the other way round. Most BI tools look only at individual organizational silos and tend to miss out on collating coherent information. An ideal BI tool should collate information from the entire business operation, analyze the core functional areas, compare data from multiple types of ERP and external data sources, and then generate a consolidated analytical model that captures all the intricacies.

2)    Does the tool generate a mere report or engage analysis? 

Businesses most often fail to see the difference between a summary report and an analytical report. The generated report should be able to map across all the different sectors, and at the same time the collated information should not be just a data mesh. Well-sorted data generation should be a key component of a properly functioning BI tool.

3)    Is the data current or time-stale? 

The tools used in any form of business should generate data that is updated to the current numbers. If this is absent, the data generated would be just figures of the past rather than the present. The tool should be well equipped to distill the essential data promptly, ensuring that the business stays in the competition and does not get backlogged.

4)    How fast is the tool and how flexible is it?

An effective BI tool that can promptly turn out reports from the collated information is a major advantage for proper projection of growth and for damage control in an organization. Any tool that takes days to churn out information will be of no use to the business. Besides, just like technological evolution, the tool should be technically adaptable to market trends, as tools can become outdated within a few months.

5)    How soon can the BI tool be put into play and is it an all in one package?

While most organizations pick ready-to-use services, building custom BI tools should be quick too; otherwise, the business would be moving ahead without the requisite projections. Also, there is no one tool that fits the ideal package. Hence, businesses can try out trial runs, pick the one that fits their parameters, and tweak the little details through their IT department to ensure that their needle in the haystack is sorted out.

Have you evaluated your BI tool on the above criteria? What are some primary factors you consider while evaluating business intelligence tools & services? Sigma's BI services ensure flexibility with the BI tools best suited for your business. Do leave your thoughts in the comments section below.

Friday, July 28, 2017

AI in BI – Intelligence, The Way Forward


Traditionally, business intelligence (BI) was restricted to business analysts who supplied information based on data collected over set time periods. The evolution of data and the collection of real-time data have greatly influenced the structuring of BI trends. The speed of data is imperative for driving timely actionable insights: data that was current a day ago can become stale within the next few days. Consequently, live access to data and its immediate interpretation have become the core of BI models. Like data, BI models too have started changing constantly with time, bridging the gap between data gathering and analysis.

The dawn of digitalization

The metrics of digitalization, consumerization, agility, security, analytics, cloud, and mobile are simultaneously influencing the changing landscape of BI. One of the revolutionizing ideas taking form for better BI control is Artificial Intelligence (AI). It has become a new face in the BI space as real-time data crunching becomes more demanding, with second-by-second analysis. Using the evolution of built-in algorithms and age-old data analysis tools, businesses can build effective models through AI. It makes the data not just live but also visualized for effective analysis.

Current tech landscape of AI
The purpose of business analytics is to answer and project what the future holds. Artificial Neural Networks (ANN) and the Autoregressive Integrated Moving Average (ARIMA) are two common techniques enabling better BI under predictive analysis. While ANN models work much like the neurons of the human brain in trying to chain data into a visualization, the latter technique, ARIMA, concentrates on time-series analysis that predicts scenarios by synchronizing both past and current data.

Besides providing real-time data analysis tools, Artificial Intelligence (AI) is indeed engulfing business intelligence. We have witnessed several businesses incorporating AI models for the efficient functioning and success of their business models. It would be safe to say that some of the areas where AI has been quite successful are sales, general electric companies that deal with the intricate repair of machinery, hospitals and, in certain cases, the monitoring of machine fleets and factories. If you are not convinced that AI is slowly taking over BI – here is a fact for you – AI is now the new decision maker! If you are looking for a smart business partner, you know who to reach out to next! This brings to mind a rather intriguing quote from Woodrow Wilson, who stated: "We should not only use the brains we have but all that we can borrow." Ain't that quaint?

The way forward

We saw an upswing in BI technologies such as cloud analytics and embedded integration systems all through 2016, and they will continue to reign over the BI world as smaller businesses are still in the process of shifting gears into bigger technology. Business analysts have predicted 2017 as the year businesses start migrating to technologically advanced BI models.

Here is a million dollar question – Are you leveraging BI to your strength?

Drop a comment, and I’d be happy to discuss the future of BI with you.

Wednesday, July 19, 2017

Why You Need a Newsletter Subscription Feature on Your Magento 2 Store

The one-line answer to this question – by default, Magento 2 (CE) doesn't have one, and you need to increase your customer lifetime value! Agree completely? You may want to check out this Magento 2 extension.

Interested in digging deeper? Read on.

So you are all set up on Magento 2 and have a kick-ass digital marketing strategy in place which ensures that your potential customers find you when they need to buy. Now what? You wait for the customers to come pouring in to your online store and transact. But is a transactional relationship with your customers the best you are aiming for?

A true salesman knows the value of establishing a solid relationship with their customers. Increasing the lifetime value of your customers is the ultimate aim you should be working toward. The good old Pareto principle holds completely true here, which means 20% of your total customers are responsible for 80% of your revenues.

Customer Lifetime Value 

Customer Lifetime Value (CLV) refers to the future profits a company expects to earn from a customer throughout his or her relationship with the business. A repeat customer is thus much more valuable to any online merchant than a one-time shopper. But how do you ensure that a continuous communication channel is built with the customer? Digital marketing tactics allow you to promote and re-market to your existing customers on the web, but at the risk of being too intrusive and in some cases even verging on the annoying. Remember how you felt when you kept seeing an unwanted digital ad for a product you had just stumbled upon in an online store?
This is where email marketing is slightly better, as you have prior permission to connect with the customers. This brings us to an important question –

Does Email Marketing still work?

Among the numerous innovative ways of establishing an intimate relationship with your clients, traditional email marketing is still one of the most effective & least intrusive ways of keeping in touch with your customers. In fact, engagement rates through email have been improving in 2017.

Why you need customers to subscribe?

Subscribing means you have a ready database of actual customers who want to hear more from you. We don't need to stress how valuable that database is to your firm. It will ensure that your customers are aware of the best offers, company news and a ton of other relevant info right in their inboxes.

How can you enable this feature in Magento 2(CE)?
As a merchant using Magento 2 Community Edition (CE), you can either ask your store maintenance provider to help you add this feature, which might be a taxing and time-consuming process, or you can buy an off-the-shelf extension at a fraction of the cost on the Magento marketplace.

Which extension to use?

You can try the Newsletter Subscription at Checkout extension for Magento 2 (CE), which is super easy to install & configure. As a merchant you will get zipped code with a user manual & installation guide for a simple step-by-step installation process. The most valuable part of this extension is that it is highly customizable, giving you the flexibility to choose what you need.

In conclusion, we would like to stress that email marketing can be a great addition to your marketing mix, but you need to lay a strong foundation for it by building a subscriber list. Adding a newsletter subscription feature on the checkout page may seem a minuscule step, but it will make all the difference when it comes to building lifetime value for your customers.

Tuesday, May 16, 2017

Advanced Template Hints for Magento: Extension Review

Is better development and debugging of your Magento store a top priority for you? As part of our Magento practice we get hands-on experience with various extensions that can help in optimizing Magento store performance. Advanced Template Hints is a highly recommended tool for Magento developers that can assist in meeting your goals.


One of the most useful features in Magento is the ability to find the templates and blocks behind each section of the page. But sometimes the standard path hints are confusing, and it is difficult to check all the details of a block. In addition, only blocks of type "Mage_Core_Block_Template" are displayed, not all block types inherited from "Mage_Core_Block_Abstract".
With the 'Advanced Template Hints' extension, the template block information is displayed in the upper corners of each block. All blocks are outlined in red, green or yellow to indicate whether they are cached or not.

As with the normal template hints, some boxes may overlap and not be visible properly, but you get the following information:
Module: the name of the module from which the block is constructed.
Example: MODULE: Mage_Core, Mage_Cms or Mage_Page
Path: the path of all blocks in which the given block is nested.
Example: PATH: Mage_Core_Block_Text (alias/name: topLinks) <- Mage_Page_Block_Html_Header (alias: header) <- Mage_Page_Block_Html (name: root)
This means that the block with the name/alias "topLinks" of type "Mage_Core_Block_Text" is contained in the block "header" of type "Mage_Page_Block_Html_Header", which in turn is contained in "root" of type "Mage_Page_Block_Html".
Template: if the block derives from the Mage_Core_Block_Template class, the template used is also shown.
Example: TEMPLATE: frontend/base/default/template/page/switch/languages.phtml
CMS block: if the block is of type "Mage_Cms_Block_Block" (i.e. a normal static CMS block), the block ID is shown.
Example: CMS-BLOCK-ID: footer_links
CMS page: if the block is of type "Mage_Cms_Block_Page" (i.e. a CMS page), the page ID is shown.
Example: CMS-PAGE-ID: home
Cache: finally, the new template hints show the caching details of the block.
Example: CACHE: Lifetime: forever, Key: c7e582f7a3b1b41fd5cd10c492c2ee13c60bae44, Tags: store, cms_block, block_html
Here the lifetime, the cache key and the assigned tags are output.

The block is also surrounded by a dotted line. Three different line colors are used so that the blocks can be quickly distinguished from one another:
  • Red: the block is not cached (its cache lifetime is set to "zero")
  • Green: the block is cached (details of the caching parameters can be read in the title tag)
  • Yellow: the block is not cached itself but is contained in a cached block, so it is implicitly cached



Uncached block (red border)

Cached block (green border)

Implicitly cached block (yellow border)
Note: This extension can also be integrated with PhpStorm by installing the "Remote Call" plugin.
Working on various Magento projects has given us an intimate understanding of many extensions and third-party tools that add to the efficiency of maintaining & developing a Magento store. I am eager to hear your thoughts and experiences on this front in the comments section below.

Tuesday, April 1, 2014

Three Reasons to Choose Grails for Mobile App Development

Grails is a web app development framework which was earlier named Groovy on Rails. Groovy is a dynamic application programming language built for the Java Virtual Machine (JVM), and the Grails development framework runs on the Java Development Kit (JDK). Groovy is the principal programming language in Grails. The most beneficial feature of Grails/Groovy is that the framework as well as the platform can run alongside Java, and programmers can make suitable changes in the development environment according to their specific requirements. When it comes to mobile application development, Grails/Groovy has now become a frequent choice of developers.

Why has Grails/Groovy become so popular around the world within such a short period of time? Why is Grails now considered one of the most useful and most agile frameworks for mobile app development? We will explore the beneficial aspects of Grails mobile app development. For the more inquisitive and enthusiastic developers: Groovy is an open-source language licensed under Apache 2.0, and it rests on proven, effective open-source software (OSS), including all the standard plug-ins such as Hibernate, Spring and Jetty. Grails is considered great for mobile app development for several reasons, and developers often cite some common beneficial aspects of Groovy/Grails, including the underlying Convention over Configuration (coding by convention) idea, the DRY or Don't Repeat Yourself approach, and an agile app development environment. Below you will find three solid reasons to choose Grails development tools over other app development frameworks.

1. Faster to start a project from scratch: As previously said, the Grails framework for mobile application development follows coding-by-convention standards. As a result, you need not build configurations every time you start a new project. You can instead invest your time and other resources in the core application coding and programming. This is undoubtedly the most advantageous feature of Grails, and this is where it outdoes many other Java-based frameworks. Grails is surely the most agile web application development framework at this moment. Developers can invest time in R&D if Grails is used for app development.

2. Full utilization of Java: Grails and Groovy can be integrated with other Java applications quite easily. This is why the mobile app development framework is considered fully compatible with Java, and developers can easily use Grails alongside one or more Java-based frameworks. Not only that, the process also becomes shorter and more streamlined. Developers can access and fully exploit the dynamic Java libraries that are integrated with Grails.

3. Don't Repeat Yourself principle: Grails developers offer agile solutions because they are equipped with one of the fastest web app development frameworks of its type. Grails follows the DRY principle and makes it easy for developers to accommodate new changes in their code. As developers do not need to change the same code time and again, delivery time is significantly reduced.

There are quite a few companies now, in India and elsewhere, that offer Grails-based mobile app development solutions. You can just search the web to find a competent offshore solutions provider.

Sigma InfoSolutions is one of the early adopters of Grails/Groovy. The Grails web service provider now offers bespoke application development solutions to offshore and onshore clients. For additional information, please visit Sigma Infosolutions, write to them at grails@sigmainfo.net, or call the helpline at +1-949-705-6980.
