February 22, 2013

Why the MPS File is Still Useful

MPS file

The Mathematical Programming System (MPS) file is a relic of the IBM MPS/360 system of the 1960s, and its format is an extension of the SHARE format that preceded it.

The SHARE format for linear programming (LP) problems assigned a 6-character name to each row and column and was limited to 12 characters of BCD data for each data element. The MPS format extended the row/column names to 8 characters (because 8 characters made up a double word on the 360 architecture) but otherwise was little different. Both were also column-oriented; that is, the non-zero coefficients were grouped together in the matrix by column rather than by row. This was a consequence of the origins of commercial mathematical programming systems, which were heavily influenced by the formulation practices of the petrochemical industry, whose practitioners thought of LP columns as “activities” with coefficients representing inputs and outputs. We are obviously talking about ancient history here.
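
To make the column orientation concrete, here is a minimal sketch (illustrative names and spacing only, not a full specification): a two-variable LP written out in MPS form and saved from Python. Note how the COLUMNS section lists the non-zeros of X1 and then of X2, grouping coefficients by column rather than by row.

# Minimal sketch: write a tiny LP in MPS form (illustrative only).
#   minimize  x1 + 2*x2   subject to  2*x1 + x2 <= 10,  x1, x2 >= 0
# Spacing here is for readability; classical readers expect fixed fields,
# while most modern readers accept free-format whitespace.
mps_text = """\
NAME          TINY
ROWS
 N  COST
 L  LIM1
COLUMNS
    X1        COST      1.0        LIM1      2.0
    X2        COST      2.0        LIM1      1.0
RHS
    RHS1      LIM1      10.0
ENDATA
"""

with open("tiny.mps", "w") as f:
    f.write(mps_text)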

So, why is the MPS file even of interest now, never mind in wide-spread use? Several reasons:

  1. The format has been generalized by many software vendors. The coefficient formats, in particular, are now free format, rather than confined to particular field lengths.
  2. The MPS file, generalized or not, is a valuable medium of exchange, which almost every MP user, whether academic or industrial, can read and understand.
  3. The major repositories of LP test problems (e.g. Netlib and Mittelmann's benchmark collection) contain many, many test problems in MPS format, and are likely to continue to do so.

This is not to say that more modern formats are excluded. For example, the GAMS library of examples includes a host of models written in a more modern, equation-oriented model generation format, as does our own example set in the FICO Xpress download package.

It is difficult to see this situation changing. There are many model generation systems, each wedded to its own modeling language and data structures, but the standard test problems seem to go on forever and they are all in MPS format.

January 25, 2013

Advances in Algorithms and Hardware Help Speed Optimization

Algorithm.jpg

The interplay between algorithmic developments and hardware improvements has driven the immense speed-up in optimization codes over the last few decades. There has long been a debate over which has contributed more to this progress. Hardware developments seem to have taken two main routes: the supercomputer and, at the lower end, the multi-processor, multi-core chip.

The supercomputer has become even more “super”, but the special algorithms once developed for vector processing are no longer used. However, we should not ignore the role played by algorithm development in the recent past. Without it, the vector and parallel supercomputers would have conferred almost no advantage on linear programming (simplex) based methods. The one bright spot was the implementation of “embarrassingly parallel” algorithms for solving discrete problems, such as mixed integer programs, by branch and bound or its relatives, often achieving super-linear speed-up with the number of processors.

The situation at the low end has been different. It is now possible to solve, in a few seconds, test LPs that once took over an hour on a mainframe. Many of the algorithmic advances in sparse matrix methods were published decades ago, but their efficiency has been dramatically improved by rules such as “never touch a zero” and by computer science concepts such as heaps. However, hardware has played a prominent role, and not just through Moore’s law. The basic laptop or desktop now has multiple processors, each often with multiple cores, in addition to multi-level caches, plus the general speed-ups that accrue over time. Thus hardware currently seems to be winning the speed race. Even so, studies suggest that over a longer period, say 20 years, the speed-ups have been quite evenly divided between mathematics/algorithms and hardware improvements.
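
As a rough illustration of the “never touch a zero” principle (a sketch of the idea only, not any particular solver's code), consider storing a column of the constraint matrix as (row index, value) pairs: an inner product with a dense vector then visits only the stored non-zeros, however long the column is.

# Sketch of "never touch a zero": a sparse column held as
# (row index, value) pairs, so the dot product below touches
# only the non-zero entries.  Illustrative only; production LP
# codes use far more elaborate sparse data structures.
sparse_col = [(0, 3.0), (4, -1.5), (7, 2.0)]   # non-zeros of one matrix column
dense_x = [1.0, 0.0, 0.0, 0.0, 2.0, 0.0, 0.0, 4.0]

dot = sum(value * dense_x[row] for row, value in sparse_col)
print(dot)   # 3.0*1.0 + (-1.5)*2.0 + 2.0*4.0 = 8.0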

What of the future? It is difficult to foresee fundamental new algorithmic developments in basic LP and sparse matrix methods, but some areas of optimization that were under-developed are beginning to come into their own. One of these is non-linear programming (NLP), which after years in the shadows (computationally, that is; it is prominent in the literature) is starting to become ready for prime time. Some hardware developments, such as more cores, are easily foreseeable, but will there be new breakthroughs, and the algorithms to exploit them? Time will tell.

January 15, 2013

The Science and Art of Decision Management at a Large Bank

FICO-case-study-button-lg

IDC just published an in-depth case study demonstrating how a large bank used FICO Blaze Advisor as a base for their decision management capabilities. Here are some key take-aways from this interesting use case:

Industry: Banking

Company: Large bank (anonymous)

Business Challenge:

With rapid geographic expansion, new product lines and a significant increase in the number of loan applications, this bank needed to deploy new technology to ensure that the operational efficiency of the lending process was maximized. The goal was to use the technology to enable the local credit manager to make better (faster, more insightful, and more consistent) decisions.

How FICO Blaze Advisor Helped:

  • Flexibility to change rules on demand
  • Scalability to move from hundreds to thousands of rules
  • Business user functionality to quickly change relevant rules instead of hard-coding them in the underlying operational system
  • Availability of audit trails for policy changes, which enabled internal policy review and compliance reporting
  • Real-time risk assessment of online customers using smart application forms
  • Mass personalization of customer interactions based on interactive application forms and intelligent assistance for product selection
  • Automated workflow, ease of deployment and maintenance and improved management reporting

Results: Blaze Advisor enabled the bank to react faster to loan applications and to respond with greater insight to changes in lending norms. In addition, the bank improved productivity through credit process automation, deploying automated scoring models for risk assessment.

Deploying an initial set of rules is only a portion of any decision management program. There is a need for ongoing review of rules and the rapid deployment of new or modified rules. This bank took Blaze Advisor and made it the foundation for all decision management. It was leveraged across multiple decision areas and prompted a true culture shift to embrace automation.

View the complete IDC case study.

January 02, 2013

The Future of Big Data is Here

Big_data_infographic

OnlineBusinessDegree.org recently created an interesting infographic to help explain the future of Big Data. As one of the hot buzzwords of 2012, it seems a fitting way to start the New Year.

I like the section explaining how Big Data can be useful:

  • It creates transparency that can be used to increase efficiency.
  • It allows for better analysis of employee and systems performances.
  • It can replace and support human decision making with automated algorithms.
  • It creates innovative business models, products and services.

This is helpful guidance as many companies struggle to figure out how to get value out of their Big Data. With the right analytic and decision-making tools, Big Data can be transformed from an unmanageable collection of raw content into valuable insights into how people behave, what they may buy, and how they are likely to respond to specific offers. This should be an essential component of every business plan.

With predictive analytics and decision optimization, companies can use these insights to better predict future customer behavior and adapt their offers and actions accordingly, delivering the right deal to the right customer at the right time within the right context. Done correctly, this results in happier customers and increased profits that also please shareholders. And that’s a sure way to start the new year right.

See full infographic.

 

December 19, 2012

Effective Use of Data and Analytics Can Increase Profits

Auto-loan-car-financing-application

In the November/December issue of Non-Prime Times, the National Auto Finance Association’s official publication, FICO’s Ken Kertz describes six strategies that auto lenders can use to win more business:

  1. Frequent model assessments – Health checks help determine the effectiveness and accuracy of your models.
  2. Custom scores – Application data in combination with credit bureau data can better assess risk.
  3. Risk category segmentation – More granular segments enable you to offer more precise pricing and respond to competitive threats.
  4. Dynamic pricing – Optimization software can be used to achieve business objectives while balancing constraints.
  5. Automated policies and strategies – Business rules can improve consistency, reduce error rates and increase customer loyalty by enabling accurate decisions.
  6. Customer intimacy – Understand your customers and treat them as individuals in order to make the right offer at the right time.

These strategies align with top technology trends we’re seeing today around using Big Data, analytics and optimization to develop more intimate customer relationships.

Read the full article here.

December 11, 2012

The Complexity of Sports Scheduling

NBA_schedule

The NBA recently fined the San Antonio Spurs $250,000 for resting four of their marquee players in a game against the Miami Heat. The Spurs coach stood by his decision, stating that this was the sixth game of a road trip and also the Spurs’ fourth game in five nights. The NBA commissioner argues that he needs to protect the league and the fans by having teams put their best product on display in a prime-time televised game.

Sports scheduling can be very complex and needs to balance a number of conflicting goals. During the regular NBA season, 30 teams, each playing around three games a week from October through April, need to be scheduled. The league must balance income from ticket sales and television deals against the cost of player salaries. It must also take into account the potential damage this demanding schedule can have on the athletes’ bodies, as well as how to distribute home and road games.

FICO’s own optimization software is used to schedule the NFL’s 256 games, which involves trillions of scheduling permutations, 20,000 variables and 50,000 constraints. We know how hard sports scheduling can become given the large amount of data and the complexity of the variables that must be considered. Without knowing all the details of why the NBA has chosen this particular schedule, it seems that this fine showcases a need for a better NBA schedule next season.
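
To give a feel for where numbers like these come from (a back-of-the-envelope sketch, not FICO's actual NFL model), index a binary variable by game and week: the assignment variables and the most basic constraints already run into the thousands, and layering on broadcast windows, stadium availability and travel rules pushes the counts toward those quoted above.

# Back-of-the-envelope sizing of a season schedule (illustrative only):
# x[g, w] = 1 if game g is played in week w.
n_teams, n_games, n_weeks = 32, 256, 17

n_vars = n_games * n_weeks                    # 4,352 assignment variables
play_once = n_games                           # each game scheduled in exactly one week
one_per_week = n_teams * n_weeks              # each team plays at most one game per week

print(n_vars, play_once + one_per_week)       # 4352 variables, 800 basic constraints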

Sports organizations that are able to leverage the power of analytics will have a significant competitive advantage and can offer a superior experience for fans.

November 07, 2012

FICO Success Snippet Featured by Gartner

Targeted marketing
Gartner analyst Jim Sinur picked up on a recent FICO success story featuring the Blaze Advisor Business Rules Management System for context-sensitive, event-based marketing. As Sinur points out, customers now expect companies to understand their preferences and to market accordingly with highly relevant offers.

Business Rules Management plays an important role in understanding what action to take when an event occurs and in coordinating the delivery of consistent messages across every touchpoint.

Check out his blog post for the full story.

November 05, 2012

Analytics and Big Data are Key Technology Trends for 2013

Gartner2012

I attended the Gartner IT Expo in October to interact with senior IT leaders, showcase FICO’s products and get the latest take on Gartner’s upcoming trends in the technology industry. I found their talk on the Top 10 Strategic Technology Trends for 2013 very interesting. Here is their list of what to look for in the next year:

  • Mobile Device Battles
  • Mobile Applications and HTML5
  • Personal Cloud
  • Enterprise App Stores
  • The Internet of Things
  • Hybrid IT and Cloud Computing 
  • Strategic Big Data
  • Actionable Analytics
  • In Memory Computing 
  • Integrated Ecosystems

Mobile and cloud were hot topics, as expected. I was heartened to see that Actionable Analytics made the list. FICO has been evangelizing the power of analytics to improve decisions, helping companies increase profitability and efficiency. Gartner’s message aligns closely with this; the firm believes that companies can “perform analytics and simulation for every action taken in the business”.

This goes hand-in-hand with Strategic Big Data, another item on the list. Companies will be mining their large data sets to learn more about their customers and deliver a superior customer experience and relationship. The key to getting value from the data is to use analytics to uncover predictions and improve decisions.

Gartner’s top ten lists are always worth noting as trend indicators. We are excited to see how these trends unfold throughout the coming year.

October 12, 2012

Celebrating 40 Years of Forrest-Tomlin

Simplex_algorithm
This year marks the 40th birthday of the Forrest-Tomlin Method, which was devised to speed up the simplex algorithm on early computer architectures. As many Operations Researchers know, the method remains one of the fundamental technologies in optimization software.

One of the cornerstones of linear programming is an efficient update of the factorization of the working basis. Many different methods have been tried for implementing this critical function, from the classical product form update, which imitates the steps of Gaussian elimination, to the Bartels-Golub update, which was the first to maintain LU factors directly by partial pivoting. In addition, we have Reid’s method of symbolic sparse updates, Saunders’ method of spike reordering, and Fletcher and Matthews’ work with the Schur complement.
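
To fix ideas, here is the standard rank-one relation that all of these updates address (textbook simplex algebra, not the detail of any one method): when the entering column a_q replaces the r-th column of the current basis B with factors B = LU, only one column of U changes, and the art lies in restoring (permuted) triangular form cheaply and stably.

\[
\bar{B} = B + (a_q - B e_r)\, e_r^{\mathsf T},
\qquad
B = LU \;\Longrightarrow\; \bar{B} = L\,\bar{U}
\quad\text{with}\quad
\bar{U} = U + (L^{-1} a_q - U e_r)\, e_r^{\mathsf T},
\]

where e_r is the r-th unit vector and L^{-1} a_q is the “spike” column; the schemes listed above differ in how they return \bar{U} to triangular form and how they store the result.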

It is not an exaggeration to say that almost all modern implementations are based on the method first devised by John Forrest and John Tomlin, and published in 1972. It combines much of the speed and ease of the product form update with the increased sparsity of the Bartels-Golub update, and was a significant improvement on both. Although many variants have been explored since – Suhl and Suhl’s sparsity-exploiting update is a more recent example – the basic method remains highly competitive; a remarkable achievement given the changes in computer hardware over that period.

Learn more about the history and the implications for today’s implementations at this year’s annual INFORMS meeting. Operations research and management science specialists will also learn about recent developments in the field and the underlying tools of the profession, such as the FICO® Xpress Optimization Suite. Stop by the FICO booth or attend one of our many sessions to learn more.

 

September 21, 2012

Decisions in the Cloud

Cloud_computing
Many companies are thinking about ways to take advantage of cloud computing. SulAmérica is turning those thoughts into action. As Brazil’s biggest independent insurance company, they needed a better way to connect with their 30,000 brokers throughout the continent in order to make more consistent, flexible underwriting decisions. SulAmérica will use FICO® Blaze Advisor®, hosted on Google’s cloud, to easily manage and change business rules while keeping hardware costs low.

This is a great example of the benefit of putting decision services in the cloud. The high-volume transactions associated with decision services make the cloud ideal for keeping IT costs down while providing the flexibility and capacity to handle spikes in volume. And in a world where companies can no longer wait months to deploy changes to their decision services, the cloud is a way to circumvent the IT backlog and get to market faster.

So what does it take to put decisions in the cloud?

  • Cloud Flexibility – every cloud vendor has a different set of capabilities that are guaranteed to change over the next few years, so you need a flexible set of choices that can adapt to the needs of your particular cloud. In addition, it is important to design your decision services so they can move freely between different providers and between in-cloud and on-premise deployments (see the sketch after this list).
  • Decision Performance and Scalability – you need to be able to handle extremely high volumes of decisions and to maintain performance as the complexity of the logic evolves over time.
  • Business User Control – to reap the greatest benefit from the cloud, you need to give the business experts a safe and effective environment for managing their own decisions without being dependent on IT.
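
One way to think about the portability point above (a generic sketch, not the Blaze Advisor API): keep the decision logic behind a small, stateless interface, so that the same service can be deployed in any cloud or on premise without change.

# Generic sketch of a stateless decision service (illustrative only,
# not the Blaze Advisor API): all inputs arrive with the request and
# the function holds no server-side state, so the same logic can run
# behind any cloud endpoint or on premise.
def credit_limit_decision(application: dict) -> dict:
    """Toy rule set: approve modest limits for applicants with decent scores."""
    score = application.get("score", 0)
    income = application.get("annual_income", 0)

    if score >= 700 and income >= 40000:
        return {"decision": "approve", "limit": 5000}
    if score >= 640:
        return {"decision": "refer", "limit": 0}
    return {"decision": "decline", "limit": 0}

print(credit_limit_decision({"score": 720, "annual_income": 55000}))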

SulAmérica is showing how companies can take advantage of evolving technology to deliver faster and better decisions that reduce risk, cut costs, and drive profitable growth. You can read the full press release here.
