Presentation Transcript


  1. OPTIMIZING MARKETING DATABASES UNDER UNCERTAINTY: A REAL OPTIONS APPROACH A. Even, G. Shankaranarayanan, P. Berger

  2. Can maximization of economic performance direct the parameters of a marketing database? We posit that economic tradeoffs should also be considered, in addition to technical and functional requirements. With this audience, I assume that I do not need to argue the importance of marketing databases!

  3. USAGES: We break the myriad uses of a marketing database into two types: 1) ROUTINE USAGE (e.g., results of direct-mail campaigns, queries to identify segments of customers, etc.). We assume that the benefits gained from routine usage are relatively easy to assess, can be viewed, for practical purposes, as deterministic, and use relatively inexpensive data resources.

  4. 2) EXPLORATIVE USAGE: Data-usage activities that are not routinely repetitive (e.g., selected data mining, complex statistical analyses, evaluation of adding or relocating warehouses) and may be at a “higher level” (e.g., evaluating a major change in marketing/business strategy). We assume that the benefits gained from explorative usage cannot be predicted with certainty, and that such usage is likely to require more sophisticated technology and, thus, higher investment in data resources and information technology.

  5. We examine two key “decision variables” in the design of the “data warehouse” (DW) that supports the marketing decisions – CAPACITY and TIMING. CAPACITY: The volume of data and the variety of usages dictate capacity requirements in terms of storage, processing, presentation, and delivery. Higher capacity, determined by design choices such as IT selection and data-resource configuration, usually implies a higher cost.

  6. Certain data warehouse choices may tailor capacity to specific needs and cost less, while offering less capability for growth/expansion. Others are more scalable and flexible, but may require larger initial investments.

  7. TIMING - Given uncertainty in the “utility” [benefit] to be gained, a superior strategy may be to defer investments to later stages, when the uncertainty is reduced or eliminated. However, such delays often involve some penalty, due to opportunity costs and/or switching costs.

  8. HIGH-LEVEL DESIGN STRATEGIES: • “Maximize” – Invest (initially) in full DW capacity to accommodate both routine and explorative usages. • “Switch” – Initially, implement a low-cost, lower-capacity solution to support routine usage, and later, AFTER EVALUATION OF EXPLORATIVE USAGE, switch to a high-end, more expensive solution, IF WARRANTED.

  9. • “Upgrade” – Initially, implement a low-cost, lower-capacity design to support routine usage, and later, AFTER EVALUATION OF EXPLORATIVE USAGE, upgrade to a high-end, more expensive design, IF WARRANTED. (“Switch” replaces the initial system; “Upgrade” adds to it.) • “Postpone” – Defer all decisions to a later stage and identify the optimal solution after all the uncertainty is resolved.

  10. Each of these strategies has different fixed (initial) costs, different evaluation costs, and different “penalty” costs due to delay. All strategies involve expected values, to reflect the uncertainty in receiving the explorative utility. BASIC MODEL OF UTILITY AND COSTS:

  11. Bk = Uk(X) − Ck(X) = Σi Uik(X) − Σj Cjk(X), where
  • X – a vector of design characteristics (e.g., inclusion of certain data fields on a 0/1 basis, or the number of records retained, along a 0-to-1 continuum)
  • Bk – the NET benefit associated with configuration k (out of K)
  • Uik(X), Uk(X) – the utility associated with the I usages (indexed by i), and the overall utility, within system configuration k
  • Cjk(X), Ck(X) – the cost associated with the J cost factors (indexed by j), and the overall cost, within system configuration k

  12. The index k reflects high-level design decisions among K alternatives (e.g., data provider, database management software). Once k is decided, Bk (or its NPV) is to be maximized by choice of X, subject, perhaps, to constraints. The previous slide states everything as deterministic; we extend the problem to the stochastic case.
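As a minimal sketch of this deterministic base model (Python, with illustrative numbers and names that are my own, not values from the paper):

```python
def net_benefit(utilities, costs):
    """B_k = U_k(X) - C_k(X) = sum_i U_ik(X) - sum_j C_jk(X) (slide 11)."""
    return sum(utilities) - sum(costs)

# Illustrative: two usages (routine, explorative) and two cost factors
# (say, storage and processing) for one candidate configuration k.
B_k = net_benefit(utilities=[100.0, 250.0], costs=[80.0, 120.0])  # 150.0
```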

  13. To repeat, the decision variables: CAPACITY – the capabilities and resources needed to support usage needs. Excess capacity is a known way to deal with uncertainty (in usage needs and utility), but may be costly; e.g., doubling the data volume might require a new database server. TIMING – the deferral (or not) of some investments in the DW system, to reflect uncertainties in the utilities. Deferral may involve a delay penalty, as failure to support usage needs in a timely manner can reduce utility.

  14. We assume that routine usage contributes UA with certainty (with, for all practical purposes, probability PA = 1). We treat explorative usage differently: its utility is assumed to be a Bernoulli random variable – UB with probability PB, 0 with probability (1 − PB), PB < 1. We have an evaluation time, TE, and cost, CE. We assume here that after TE, at a cost of CE, we can determine with certainty whether explorative usage will be successful. (A mild extension can accommodate the case where only some of the uncertainty is resolved.) We assume that delay reduces utility; the “sensitivity” to time delay is D(T), a decreasing function of T, with D(T=0) = 1 and D(T=∞) = 0. Here, with t = the time-sensitivity parameter, we use D(T) = exp(−tT). Delay = the lag between identifying usage needs and implementing a way to support them.

  15. We have Uik = Pi • Di(Tk) • U*i • Πg Xg^βi,g, where
  • Uik – the utility of usage [i] within system configuration [k] (here, routine usage: i = A, versus explorative: i = B)
  • Pi – the utility probability of usage [i] (here, assuming PA = 1, PB < 1)
  • Di – the time-delay penalty function of usage [i]
  • Tk – the time delay associated with configuration [k]
  • U*i – the maximum utility of usage [i]
  • Xg – component [g] (out of G) of the design-characteristic vector X
  • βi,g – the sensitivity of utility [i] to Xg (can be either < 1, concave; or > 1, convex; or = 1, linear)
  If an Xg is either 0 or 1 (excluded or included) and that component is not needed for a specific usage, then βi,g = 0, and we take Xg^βi,g = 1 when both Xg and βi,g are 0 (so that an excluded, unneeded component does not affect the utility).
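A minimal Python sketch of this utility model (parameter values are made up for illustration, and the function names are my own, not from the paper):

```python
import math

def delay_penalty(T, tau):
    """D(T) = exp(-tau * T): equals 1 at T = 0 and decays toward 0 (slide 14)."""
    return math.exp(-tau * T)

def usage_utility(P_i, tau_i, T_k, U_star_i, X, beta_i):
    """U_ik = P_i * D_i(T_k) * U*_i * prod_g X_g^beta_ig (slide 15)."""
    design_factor = 1.0
    for x_g, b_g in zip(X, beta_i):
        # Python evaluates 0 ** 0 as 1, so a component that is both excluded
        # (X_g = 0) and unneeded (beta_ig = 0) leaves the utility unchanged.
        design_factor *= x_g ** b_g
    return P_i * delay_penalty(T_k, tau_i) * U_star_i * design_factor

# Routine usage (P = 1, mildly time-sensitive, all needed components included):
U_A = usage_utility(P_i=1.0, tau_i=0.1, T_k=1.0, U_star_i=100.0,
                    X=[1, 1], beta_i=[1.0, 0.0])

# Explorative usage with a needed component excluded (X_g = 0, beta_ig > 0):
# the factor 0 ** 2.0 zeroes the whole utility, as intended.
U_B = usage_utility(P_i=0.5, tau_i=0.3, T_k=2.0, U_star_i=300.0,
                    X=[1, 0], beta_i=[1.0, 2.0])
```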

  16. COST AND TIME TO IMPLEMENT • A “high-end” configuration has high capacity to support all usages (routine and explorative), a relatively high cost (CH), and a relatively long implementation time (TH). • A “low-end” configuration has low capacity – it supports routine usage, but NOT explorative usage – a relatively low cost (CL), and a shorter implementation time (TL). {The problem with “low-end” is that if there is later a desire to expand, the initial configuration is “wasted.”} • CL < CH, TL < TH

  17. There is also a “middle ground” in cost and time, involved with “upgrading.” We have a foundation cost, CF, and time, TF, such that CL < CF < CH and TL < TF < TH. Also, there is an upgrade cost, CU, which takes time TU, such that (CF + CU) > CH and (TF + TU) > TH. (So, if one knew in advance that one was going to upgrade, it would be better to do it all at once at the beginning. But, of course, one doesn’t know – indeed, this is the real-options side of things.)
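A quick numeric illustration of these orderings (all values are assumptions for the sketch, not from the paper):

```python
# Hypothetical cost/time parameters satisfying the orderings on slides 16-17.
C_L, C_F, C_H = 50.0, 60.0, 100.0   # low-end, foundation, high-end costs
C_U = 55.0                          # upgrade cost
T_L, T_F, T_H = 1.0, 1.5, 3.0       # implementation times
T_U = 2.0                           # upgrade time

assert C_L < C_F < C_H and T_L < T_F < T_H
# Upgrading in two steps costs more (and takes longer) than building
# the high-end configuration all at once:
assert C_F + C_U > C_H and T_F + T_U > T_H
# So "Upgrade" can still pay off only because the second step is optional;
# that optionality is the real option.
```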

  18. MAXIMIZE (M)

  19. SWITCH

  20. UPGRADE

  21. POSTPONE

  22. In a given situation, for each of the four strategies, one can evaluate the expected net benefit (by “working back” the decision-flow diagram) and choose the strategy with the maximum expected net benefit. It is certainly possible for any of the four to be optimal, depending on the values of the P’s, U’s, C’s, T’s, X’s, β’s, t’s, and the D(T) functions.
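The comparison can be sketched numerically. The expected-net-benefit expressions below are my own plausible reading of the four decision flows (the transcript omits the diagrams on slides 18-21), and all parameter values are illustrative:

```python
import math

def D(T, tau=0.2):
    """Delay penalty D(T) = exp(-tau * T) (slide 14)."""
    return math.exp(-tau * T)

# Illustrative parameters (not from the paper):
U_A, U_B, P_B = 100.0, 300.0, 0.5                # utilities, success probability
C_L, C_F, C_H, C_U, C_E = 50.0, 60.0, 100.0, 55.0, 10.0
T_L, T_F, T_H, T_U, T_E = 1.0, 1.5, 3.0, 2.0, 1.0

strategies = {
    # Maximize: build the high-end system now; both usages wait T_H.
    "maximize": U_A * D(T_H) + P_B * U_B * D(T_H) - C_H,
    # Switch: low-end now, then evaluate; if warranted, replace with high-end.
    "switch": U_A * D(T_L) - C_L - C_E
              + P_B * (U_B * D(T_E + T_H) - C_H),
    # Upgrade: foundation now, then evaluate; if warranted, add the upgrade.
    "upgrade": U_A * D(T_F) - C_F - C_E
               + P_B * (U_B * D(T_E + T_U) - C_U),
    # Postpone: evaluate first, then build whichever configuration is optimal.
    "postpone": -C_E
                + P_B * ((U_A + U_B) * D(T_E + T_H) - C_H)
                + (1 - P_B) * (U_A * D(T_E + T_L) - C_L),
}
best = max(strategies, key=strategies.get)
```

With these particular numbers, the optional second step makes "upgrade" the winner, but shifting the P's, C's, or the time-sensitivity parameter tau can make any of the four strategies optimal, as the slide states.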

  23. TABLE TO HIGHLIGHT EXPECTED UTILITIES, EXPECTED COSTS, AND WHEN EACH STRATEGY MAY BE OPTIMAL:

  24. EXAMPLE – Using loyalty cards. Goal: to link individual transactions to individual customers; considering a DW to maintain this integrated data. Routine usage: basic customer details, including some demographics, some segmentation, and some individualizing of promotions; data comes from the current CRM system. Explorative usage: supporting strategic decisions such as analyzing the revenue potential of new locations, developing new promotional policies, and identifying changes in purchase patterns; data from the CRM system AND an outside vendor.

  25. A draft of a paper includes various detailed examples. • Key macro issues: taking into account the economic consequences of designing a DW environment to support marketing decisions. The framework considers issues such as optimizing data quality and how much data to “carry along,” quantifies the choice among the four strategies of “maximize,” “switch,” “upgrade,” and “postpone,” and can highlight the value of the “real options” available in the timing of decisions.
