WO2011016731A1 - Automatic graphic generation - Google Patents

Automatic graphic generation

Info

Publication number
WO2011016731A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
parameters
colour
pattern
participant information
Application number
PCT/NZ2009/000155
Other languages
French (fr)
Inventor
Peter Guy Evans
Regan Grey Tuck
Original Assignee
New Zealand Racing Board
Application filed by New Zealand Racing Board filed Critical New Zealand Racing Board
Priority to PCT/NZ2009/000155
Publication of WO2011016731A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06T 11/001 Texturing; Colouring; Generation of texture or colour


Abstract

A method and system for generating a graphic representing a participant in a wagering event receives information relating to the participant's uniform or some other information associated with the participant, and generates a graphic in accordance with that information. The information may concern the colours and patterns of a jockey's silks. Colour and/or pattern parameters within the participant information are identified using a comparison to a set of known colour and/or pattern parameters. The parameters are then analysed in accordance with a set of rules, the set of rules relating to allowed and/or likely interactions between colour and/or pattern parameters. The analysed information is applied to a template, to create a graphic representing the participant information; and the graphic and/or the analysed participant information in a standard form can be stored in memory.

Description

AUTOMATIC GRAPHIC GENERATION
FIELD OF THE INVENTION
The invention relates to systems and methods for automated generation of graphics.
In particular the invention relates to automated generation of graphics representative of participants in a wagering event.
BACKGROUND TO THE INVENTION
Wagering is possible on a wide variety of events, including horse races, dog races, sports events (e.g. rugby or cricket games) and the like. In many of these events a particular participant in the event will be associated with a set of colours. Rugby or football teams for example will have a coloured playing strip, while greyhounds wear a coloured "rug". In horse racing each jockey or harness driver wears silks of a distinctive colouration and/or pattern. By means of these colours and/or patterns, it is possible to distinguish the participants from each other.
It would be desirable to include symbols representing the colours and/or patterns of participants in wagering material, such as guides or websites. Thus, a horse racing guide or website could include printed symbols representing the silks worn by a jockey or harness driver, showing the colouration and pattern of the jacket and/or cap. However, creation of such graphics would be extremely time-consuming and therefore expensive. This problem is exacerbated by the very large number of silks which are currently registered. In New Zealand, for example, there are currently around 3000 registered silks. Creating graphics for each silk manually, at say 5 minutes per silk, would take 250 hours. In bigger countries the problem is even worse - with 15,000 silks registered in Australia, for example. It is an object of the invention to provide an improved method of creating graphics representative of wagering participants, or at least to provide the public with a useful choice.
SUMMARY OF THE INVENTION
In a first aspect the invention provides a computer-implemented method of generating graphics representing participants in a wagering event, including the steps of:
receiving participant information associated with a participant in a wagering event, wherein the participant information includes colour and/or pattern information;
identifying colour and/or pattern parameters within the participant information using a comparison to a set of known colour and/or pattern parameters;
analysing the participant information in accordance with a set of rules, the set of rules relating to allowed and/or likely interactions between colour and/or pattern parameters;
applying the analysed participant information to a template, to create a graphic representing the participant information; and
storing in memory the graphic, and/or the analysed participant information in a standard form.
Preferably the received information includes item information, the method including identifying item parameters within the received information using a comparison to a set of known item parameters, and wherein the set of rules relates to allowed and/or likely interactions between item, colour and pattern parameters.
Preferably the method includes marking each identified colour or pattern parameter as a colour parameter or a pattern parameter. Preferably the method includes marking each identified item, colour or pattern parameter as an item parameter, a colour parameter or a pattern parameter. Preferably the method includes identifying any unknown parameters and marking them as unknown parameters. Preferably the item information identifies one or more items of a jockey's uniform, from the group: body, sleeves, cap, collar, cuffs and epaulettes. Preferably the method includes dividing the received participant information into elements and/or sub-elements. Preferably the method includes dividing the received participant information into elements and sub-elements, each element being defined by one or more separators in the received participant information and each sub-element including a single word.
Preferably the method includes determining whether an element includes a single word, and if not, analysing interactions between parameters within that element.
Preferably the method includes determining whether an operator is included within an element and, if so, grouping parameters within the element in accordance with a grouping rule set. Preferably the method includes expanding the grouped parameters in accordance with an expansion rule set.
Preferably the method includes storing the elements and/or sub-elements in memory.
Preferably the method includes identifying unknown parameters in the participant information; and displaying a user prompt to allow a user to define or correct the unknown parameter.
Preferably the method includes identifying unknown parameters in the participant information; and attempting to analyse the unknown parameters by grouping with known parameters and comparing the combined unknown and known parameters with a set of known compound parameters. Preferably the method includes, if the attempt to analyse the unknown parameter is unsuccessful, displaying a user prompt to allow a user to define or correct the unknown parameter. Preferably the method includes applying a series of masks or layers to the template in accordance with the analysed participant information. Preferably the method includes applying a mask or layer carrying a base colour to the template in accordance with a base colour identified in the participant information. Preferably the method includes applying a mask or layer for each pattern identified in the participant information. Preferably applying a mask involves importing a mask or layer file in accordance with the analysed participant information and applying that mask or layer file to the template. Preferably the received participant information is in the form of a manually input text description. Preferably the manually input text description is retrieved from memory.
Preferably the method includes the step of filtering variants from the received participant information. Preferably the variants are variants of known words and/or separators.
Preferably the graphics are representative of the silk colouring and pattern for a jockey or harness driver and the wagering event is a horse race. Alternatively the graphics may be representative of a dog's rug and the wagering event may be a dog race. Alternatively the graphics may be representative of a player's uniform and the wagering event may be a sporting event.
In a second broad aspect the invention provides a computer readable medium having encoded thereon computer instructions for computer implementation of the method of the first aspect.
In a third broad aspect the invention provides a graphic generation system for generating graphics representative of participants in a wagering event, the system including:
memory; and a processor configured to receive participant information associated with a participant in a wagering event, wherein the participant information includes colour and/or pattern information, and to perform the following steps:
identify colour and/or pattern parameters within the participant information using a comparison to a set of known colour and/or pattern parameters;
analyse the participant information in accordance with a set of rules, the set of rules relating to allowed and/or likely interactions between colour and/or pattern parameters;
apply the analysed participant information to a template, to create a graphic representing the participant information; and
store in the memory the graphic, and/or the analysed participant information in a standard form.
Preferably the received information includes item information, the processor being configured to identify item parameters within the received information using a comparison to a set of known item parameters, and wherein the set of rules relates to allowed and/or likely interactions between item, colour and pattern parameters.
Preferably the processor is configured to mark each identified colour or pattern parameter as a colour parameter or a pattern parameter. Preferably the processor is configured to mark each identified item, colour or pattern parameter as an item parameter, a colour parameter or a pattern parameter.
Preferably the processor is configured to identify any unknown parameters and mark them as unknown parameters.
Preferably the item information identifies one or more items of a jockey's uniform, from the group: body, sleeves, cap, collar, cuffs and epaulettes. Preferably the processor is configured to divide the received participant information into elements and/or sub-elements. Preferably the processor is configured to divide the received participant information into elements and sub-elements, each element being defined by one or more separators in the received participant information and each sub-element including a single word. Preferably the processor is configured to determine whether an element includes a single word, and if not, to analyse interactions between parameters within that element.
Preferably the processor is configured to determine whether an operator is included within an element and, if so, to group parameters within the element in accordance with a grouping rule set. Preferably the processor is configured to expand the grouped parameters in accordance with an expansion rule set.
Preferably the processor is configured to store the elements and/or sub-elements in the memory.
Preferably the system includes a user display device and a user input device, the processor being configured to identify unknown parameters in the participant information; and to display a user prompt on the user display device to allow a user to define or correct the unknown parameter using the user input device.
Preferably the processor is configured to identify unknown parameters in the participant information; and to attempt to analyse the unknown parameters by grouping with known parameters and comparing the combined unknown and known parameters with a set of known compound parameters. Preferably the system includes a user display device and a user input device, the processor being configured, if the attempt to analyse the unknown parameter is unsuccessful, to display a user prompt on the user display device to allow a user to define or correct the unknown parameter using the user input device. Preferably the processor is configured to apply a series of masks or layers to the template in accordance with the analysed participant information. Preferably the processor is configured to apply a mask or layer carrying a base colour to the template in accordance with a base colour identified in the participant information. Preferably the processor is configured to apply a mask or layer for each pattern identified in the participant information. Preferably the processor is configured to import a mask or layer file in accordance with the analysed participant information and apply that mask or layer file to the template. Preferably the received participant information is in the form of a manually input text description. Preferably the processor is configured to retrieve the manually input text description from the memory.
Preferably the processor is configured to filter variants from the received participant information. Preferably the variants are variants of known words and/or separators.
Preferably the graphics are representative of the silk colouring and pattern for a jockey or harness driver and the wagering event is a horse race. Alternatively the graphics may be representative of a dog's rug and the wagering event may be a dog race. Alternatively the graphics may be representative of a player's uniform and the wagering event may be a sporting event.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention will now be described by way of example only, with reference to the accompanying drawings, in which:
Figure 1 is a flow diagram illustrating a method according to one embodiment;
Figure 2 is a flow diagram illustrating the decoding step of Figure 1;
Figure 3 is a flow diagram illustrating the grouping step of Figure 2;
Figure 4 is a flow diagram illustrating the graphic generation step of Figure 1; and
Figure 5 is a schematic drawing of a graphic generation system according to one embodiment.
DETAILED DESCRIPTION
The invention provides a method and system for generating graphics representing participants in a wagering event.
The invention will be described with reference to jockey silks and horse racing. However, the skilled reader will understand that the invention is equally applicable to other participant information such as colours or uniforms identifying participants in an event, for example greyhound rugs and dog racing, playing strip for a sports team etc.
Information on jockey silks is generally stored in a text description, such as: "black, white stripes and green armbands, yellow cap with pink spots". This indicates that the silk is black with white stripes and green armbands. The jockey's cap is yellow with pink spots. Many different colours and patterns are possible. In addition to the many colours which can be used, patterns including stripes, bands, spots, stars, crosses, polka dots, and others can be applied to the body, sleeves, cap, collar, cuffs or epaulettes of a jockey's uniform. Any combination of these colours and patterns is possible.
Such records are usually manually created and therefore include significant numbers of errors. For example, jockey silk descriptions are estimated to be accurate for only 88% of Australian jockey silks. Spelling errors are typical. In addition, non-standard descriptions occur from time to time (e.g. "Green frogs" or "Orange dog in circle"). In addition, separators (e.g. and, &, with, ",", ";") vary between descriptions.
Interpretation of the jockey silk descriptions is therefore difficult. The Applicant has created a method and system for automated graphic generation as follows.
Referring to Figure 1, at step 1 a description of the jockey silk is received. The silk description may be retrieved from memory or input directly by a user. The silk description may for example be: "black, white stripes and green armbands, yellow cap with pink spots".
At step 2 a filtering process is carried out. This is the first of several steps designed to remove any variants or irregularities and create a standard description which can be understood by the system. These manipulations of the description string allow the following logic to work better. The filtering step may involve detecting non-standard separators or descriptors and replacing these with standard separators or descriptors. For example, the symbol "&" can be replaced by the word "and"; the abbreviation "dk." can be replaced by the word "dark". The standard separators or descriptors may vary from system to system. In some embodiments it may be desirable to replace all separators with a standard separator, such as ",".
Various other data cleansing processes may be carried out as part of this filtering step, such as clearing blank spaces or full stops. Following the filtering step the silk description will read: "black, white stripes and green armbands, yellow cap, pink spots".
Spelling errors can also be corrected at this stage by comparison to a set of known variants. For example a list or look up table of variants for the word "sleeves" could include the following common errors: slevees, slaves, sleaves, slvees, slevs, slevas. Thus any one of these variants will be replaced by the correct word "sleeves". Alternatively, such spelling errors could be dealt with later in the analysis process.
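By way of illustration only, the filtering of step 2 might be implemented along the following lines in Python. The replacement tables shown contain a handful of sample entries invented for this example; they are not the patent's own look-up tables.

```python
import re

# Illustrative replacement tables (sample entries only, not the patent's actual look-up tables).
SEPARATOR_MAP = {"&": "and", ";": ",", " with ": ", "}
DESCRIPTOR_MAP = {"dk.": "dark"}
VARIANT_MAP = {"slevees": "sleeves", "slaves": "sleeves", "sleaves": "sleeves",
               "slvees": "sleeves", "slevs": "sleeves", "slevas": "sleeves"}

def filter_description(description: str) -> str:
    """Step 2: normalise separators/descriptors and correct known spelling variants."""
    text = description.lower()
    for non_standard, standard in {**SEPARATOR_MAP, **DESCRIPTOR_MAP}.items():
        text = text.replace(non_standard, standard)
    # Correct known variants word by word.
    words = [VARIANT_MAP.get(w, w) for w in re.split(r"\s+", text) if w]
    text = " ".join(words)
    # Tidy spacing around commas and drop full stops.
    text = re.sub(r"\s*,\s*", ", ", text).replace(".", "").strip()
    return text

print(filter_description("black, white stripes & green armbands, yellow cap with pink spots"))
# -> "black, white stripes and green armbands, yellow cap, pink spots"
```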
At step 3, the description is split up into primary elements defined by the separator "," and stored in an array. Alternatively the data may be stored in any other suitable manner, including for example in a relational database.
Our example above yields the following array:
Array element 0: black
Array element 1: white stripes and green armbands
Array element 2: yellow cap
Array element 3: pink spots
Table 1
At step 4 each array element is split up again into individual words, with each word being stored as a sub-element. Our example now becomes:
Array element 0: black
Array element 1: white | stripes | and | green | armbands
Array element 2: yellow | cap
Array element 3: pink | spots
Table 2
The description has now been broken down into individual words which can be compared to known words. At step 5 each sub-element is then compared against item, colour and pattern parameters which are known to the program (e.g. from a look up table). For example, if a word is known to represent an item (body, sleeves, cap, collar, cuffs, epaulettes etc) it marks the word as an item (e.g. using a code, such as an 'I'). Likewise the system marks colours it recognises (e.g. using a 'C') or patterns (e.g. using a 'P'). Any words or sub-elements which are not known to the system are marked as unknowns (e.g. using a 'U'). Operators (such as 'and') may be marked with an 'O'. In our example we would end up with codes as shown in Table 3.
Array element 0: black (C)
Array element 1: white (C) | stripes (P) | and (O) | green (C) | armbands (P)
Array element 2: yellow (C) | cap (I)
Array element 3: pink (C) | spots (P)
Table 3
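The splitting and coding of steps 3 to 5 can be sketched as follows. The vocabulary sets are small illustrative samples, not the full look-up tables a real system would hold.

```python
# Small illustrative vocabularies; the real system would use full look-up tables.
ITEMS = {"body", "sleeves", "cap", "collar", "cuffs", "epaulettes"}
COLOURS = {"black", "white", "green", "yellow", "pink", "blue"}
PATTERNS = {"stripes", "armbands", "spots", "stars", "crosses"}
OPERATORS = {"and"}

def split_into_elements(description: str):
    """Step 3/4: split on the standard separator, then into single-word sub-elements."""
    return [element.strip().split() for element in description.split(",")]

def code_word(word: str) -> str:
    """Step 5: mark each sub-element as Item, Colour, Pattern, Operator or Unknown."""
    if word in ITEMS:
        return "I"
    if word in COLOURS:
        return "C"
    if word in PATTERNS:
        return "P"
    if word in OPERATORS:
        return "O"
    return "U"

description = "black, white stripes and green armbands, yellow cap, pink spots"
elements = split_into_elements(description)
codes = [[code_word(w) for w in element] for element in elements]
print(elements)  # [['black'], ['white', 'stripes', 'and', 'green', 'armbands'], ...]
print(codes)     # [['C'], ['C', 'P', 'O', 'C', 'P'], ['C', 'I'], ['C', 'P']]
```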
Now the system begins to analyse or decode the information, at step 6. First a math-like function is formed from each primary element, and the function is expanded into its most basic form - an item with a pattern, an item with a colour, and a pattern with a colour. Figure 2 shows the analysis or decoding process in more detail. At step 20 each sub-element is checked in the context of its primary element to see if the primary element includes any other words. If not then the system can determine at step 21 that this is a single word, and so long as it is not an unknown word, no further decoding is required at this stage. If, however, there are words in other sub-elements of the same primary element, the system must check to see how those words interact with each other.
Firstly the system must look to see if the combination of more than one word is required to match a particular object. For example, 'dark blue' is two words, but we have 'dark' and 'blue' in separate containers because of the above division of the description. For this description we would have generated the code 'U' for dark, and 'C' for blue. The system at step 22 runs through different combinations, or compound parameters, in a table until it finds the one that best suits (based on a matching rule set), which for this case would be 'UC'.
The system then searches at step 23 all of the available colours for a colour that matches the description 'dark blue'. This allows the system to pick up instances of colours made of more than one word. It also allows colours such as 'lime green' to be combined into one code 'C' rather than two ('lime' & 'green' = 'CC'). A similar process can be performed for patterns.
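A minimal sketch of this compound-parameter merging, again assuming an invented list of multi-word colours:

```python
# Illustrative compound-colour merging for steps 22/23: an Unknown followed by a Colour
# ('UC', e.g. 'dark' + 'blue') is tested against a list of known multi-word colours.
MULTI_WORD_COLOURS = {"dark blue", "lime green"}   # sample entries only

def merge_compounds(words, codes):
    """Collapse adjacent sub-elements that together name a known colour into one 'C'."""
    merged_words, merged_codes = [], []
    i = 0
    while i < len(words):
        if i + 1 < len(words) and f"{words[i]} {words[i + 1]}" in MULTI_WORD_COLOURS:
            merged_words.append(f"{words[i]} {words[i + 1]}")
            merged_codes.append("C")               # 'UC' or 'CC' becomes a single 'C'
            i += 2
        else:
            merged_words.append(words[i])
            merged_codes.append(codes[i])
            i += 1
    return merged_words, merged_codes

print(merge_compounds(["dark", "blue", "sleeves"], ["U", "C", "I"]))
# -> (['dark blue', 'sleeves'], ['C', 'I'])
```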
If an unknown parameter cannot be successfully processed in this way, a user prompt may be displayed seeking user correction or definition of the parameter. The user corrected or defined parameter may be added to a list or look up table of known parameters for future reference by the graphic generation system.
In our example, none of the colours or patterns described are made up of multiple words so that nothing is altered in this step. At step 24 the system starts to group parameters together in such a way that a property can be applied to multiple things. An example might be 'yellow sleeves and cap'. In this case the colour yellow must be applied to the sleeves and the cap, not just the sleeves. This is achieved through a form of factorisation, i.e. yellow(sleeves and cap). The brackets serve as they would in a mathematical function with the colour yellow applying to all of the objects within the brackets. This is essential to the next step, which is to expand the functions into their most fundamental form.
The grouping process is shown in further detail in Figure 3. Initially the system performs a preliminary grouping 30, by analysing the element and looking at the codes associated with each sub-element. For example, in the 2nd element (array element 1) there are 5 sub-elements. Combining their codes gives us 'CPOCP' for 'white stripes and green armbands' (colour pattern operator colour pattern). The system then looks at the combined code generated and determines if it is a pattern that is recognised, or one that cannot be interpreted. If it cannot be interpreted, user input must be sought to assist the graphic generation system. If the pattern can be recognised then the system begins hunting through the code deciding how to group things together. The system uses a set of rules for preliminary grouping, including rules such as:
a. grouping an unknown with a colour or pattern
b. grouping two colours that are next to each other to create a third colour
From our example: 'white' is grouped with 'stripes'; and 'green' is grouped with 'armbands'.
If the element had included the description 'yellow sleeves and cap' it would have broken down as: 'yellow' grouped with 'sleeves'; and 'cap' grouped with nothing - this would have been rectified by the next stage. At step 31, each primary element is searched for the existence of an operator. If an operator is found, this flags that further grouping is required, in which case a further grouping rule set is applied at step 32. This rule set may include such rules as colour pattern group rules (see below) to build the overall graphic.
In our example the system would find that the 2nd primary element has an operator in it. This would then be handled as follows:
The code from 'white stripes' is CP
The code from 'green armbands' is CP
The system applies rules for 'CPCP' cases. Here it is renamed at step 33 as a CPG or colour pattern group. It is then grouped as CPG(CP, CP). This essentially tells the program that there is a group of colours and patterns, and within it are the colours and patterns of each object.
If we had 'yellow sleeves and cap' at this stage it would have been grouped as GI(II) for a group of items (item 1 and item 2).
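One possible rendering of the preliminary grouping and the CPG rule in code, using an invented tuple representation for the groups (the patent does not prescribe any particular data structure):

```python
# Illustrative grouping for steps 30-33: pair each colour with the pattern or item that
# follows it, then collapse an operator-joined sequence such as 'CPOCP' into a
# colour-pattern group, written here as ('CPG', [pair, pair]).

def preliminary_group(words, codes):
    """Pair adjacent colour+pattern or colour+item sub-elements; keep operators as-is."""
    groups = []
    i = 0
    while i < len(words):
        if codes[i] == "C" and i + 1 < len(words) and codes[i + 1] in ("P", "I"):
            groups.append((codes[i] + codes[i + 1], words[i], words[i + 1]))  # e.g. ('CP', 'white', 'stripes')
            i += 2
        else:
            groups.append((codes[i], words[i]))
            i += 1
    return groups

def group_element(words, codes):
    """If an operator is present, collapse the paired groups into a single CPG group."""
    groups = preliminary_group(words, codes)
    if any(g[0] == "O" for g in groups):
        members = [g for g in groups if g[0] != "O"]
        if all(g[0] == "CP" for g in members):
            return ("CPG", members)                 # CPCP -> CPG(CP, CP)
    return groups

element = ["white", "stripes", "and", "green", "armbands"]
codes = ["C", "P", "O", "C", "P"]
print(group_element(element, codes))
# -> ('CPG', [('CP', 'white', 'stripes'), ('CP', 'green', 'armbands')])
```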
Returning to Figure 2, the next stage at step 25 is to expand these groupings or functions. We will follow the example through this stage.
The first primary element only contains the colour black (C). As this is in the first element the system applies an expansion rule which determines that this is the main colour of the body. It then makes the body this colour, while also defining it as the primary colour and the current colour.
The second primary element contains the code CPG(CP, CP). The system therefore applies expansion rules for CPG cases. Here the system analyses the two sets of 'CP' parameters. The system determines that the first pattern is stripes and that, although stripes can be applied to the body, sleeves and cap, the body's pattern has not yet been defined (although it may have no pattern). It then decides that, as the last item accessed was the body, the pattern belongs to it. It then decides that the colour applies to the pattern, so the body is now black with white stripes.
Looking at the next CP it sees that it has the pattern armbands. A rule says that the pattern 'armbands' can only apply to the sleeves. It then associates the armbands with the sleeves and makes them green. So the sleeves now are black with green armbands.
If no colour is mentioned with a pattern or item, a rule requires that the previously mentioned colour is used. Thus "yellow sleeves and cap" would be interpreted as "yellow sleeves and yellow cap".
Once all of the body segments have been defined by colour and pattern the system looks for any parts that have not been defined. Those parts are assumed to inherit the initial body colour. For example, 'black, white spots' will define black as the body colour, and give it white spots. However, the sleeves and cap colour are not specifically mentioned. In this case these body parts inherit the base colour of black.
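The expansion rules of step 25 could be approximated as follows. The dictionary of pattern-to-item rules and the decoded input format are simplified assumptions that merely reproduce the worked example above:

```python
# Illustrative expansion for step 25, using simplified versions of the rules described
# above: the first element's colour becomes the base colour, 'stripes' attaches to the
# last item accessed, 'armbands' may only apply to the sleeves, and any part left
# undefined inherits the base colour.
PATTERN_ITEM_RULES = {"armbands": "sleeves", "stripes": None}   # None = last item accessed

def expand(decoded_elements):
    silk = {"body": {}, "sleeves": {}, "cap": {}}
    base_colour = decoded_elements[0][0][1]              # e.g. ('C', 'black') in the first element
    for part in silk:
        silk[part]["colour"] = base_colour               # parts inherit the base colour by default
    last_item = "body"
    for element in decoded_elements[1:]:
        for group in element:
            if group[0] == "CI":                          # colour + item, e.g. ('CI', 'yellow', 'cap')
                last_item = group[2]
                silk[last_item]["colour"] = group[1]
            elif group[0] == "CP":                        # colour + pattern, e.g. ('CP', 'white', 'stripes')
                target = PATTERN_ITEM_RULES.get(group[2]) or last_item
                silk[target]["pattern"] = group[2]
                silk[target]["pattern_colour"] = group[1]
    return silk

decoded = [[("C", "black")],
           [("CP", "white", "stripes"), ("CP", "green", "armbands")],
           [("CI", "yellow", "cap")],
           [("CP", "pink", "spots")]]
print(expand(decoded))
# body: black with white stripes; sleeves: black with green armbands; cap: yellow with pink spots
```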
Returning to Figure 1, at step 7 the decoded description is sent to an image layering routine for graphic generation. The graphic generation process is shown in greater detail in Figure 4.
At step 40 the system looks at what base colours were given to each item. At step 41 the system applies the base colours to a base mask or layer. The base mask or layer may be a template imported from a saved file. The template preferably represents the nature of the participant's uniform or the wagering event. So, in the case of jockey silks the template will include a jacket and cap which can be coloured according to the jockey silk description. From our example the body and sleeves would have the base colour 'black', while the cap would have the base colour 'yellow'. At step 42 the system looks at the patterns and their colours. The stripes are coloured white, the armbands green, and the cap's spots pink. An image mask or layer is created for each pattern and its colours at step 43. At step 44, each pattern mask or layer is applied over the top of the base mask or layer to create the final graphic. Thus the system applies a layering technique to build up the final image from the decoded text description.
Colours, patterns, and body segment files are located by linking the filename with the description, i.e. body stripes may be found in a '../body/stripes.png' directory/file.
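A layering sketch using the Pillow imaging library is shown below, reusing the silk dictionary produced by the expansion sketch above. The library choice, the 'masks/<item>/<pattern>.png' directory layout and the tinting approach are assumptions, since the patent only requires that mask or layer files be located by name and composited over a base layer; all mask files are assumed to share the template's dimensions.

```python
# A minimal layering sketch using Pillow (an assumption - the patent does not name any
# particular imaging library). Mask files are assumed to be RGBA images whose alpha
# channel defines the shape, all sized to match the template.
from PIL import Image, ImageColor

def tint_mask(mask_path: str, colour: str) -> Image.Image:
    """Fill a solid colour through the alpha channel of a mask file."""
    mask = Image.open(mask_path).convert("RGBA")
    solid = Image.new("RGBA", mask.size, ImageColor.getrgb(colour) + (255,))
    solid.putalpha(mask.getchannel("A"))          # keep only the mask's shape
    return solid

def render_silk(silk: dict, template_size=(200, 240)) -> Image.Image:
    graphic = Image.new("RGBA", template_size, (0, 0, 0, 0))
    # Steps 40/41: base colours first; steps 42-44: one tinted layer per pattern on top.
    for item, properties in silk.items():
        base = tint_mask(f"masks/{item}/base.png", properties["colour"])
        graphic = Image.alpha_composite(graphic, base)
    for item, properties in silk.items():
        if "pattern" in properties:
            layer = tint_mask(f"masks/{item}/{properties['pattern']}.png",
                              properties["pattern_colour"])
            graphic = Image.alpha_composite(graphic, layer)
    return graphic

# Example (requires the hypothetical mask files to exist):
# graphic = render_silk(expand(decoded)); graphic.save("black_white_stripes.png")
```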
The complete image may be given any desired backdrop, marked with a copyright symbol and stored for future use (step 8 of Figure 1 ). Clearly the image must be stored in some way associating it with the correct participant. In one embodiment the graphic's file name may simply be the description file name with spaces replaced with '_'. Alternatively the decoded description in a standardised form may be stored, or both the decoded description and the graphic may be stored.
The stored graphic can be retrieved for generation of wagering media, such as for use in racing guides in paper or electronic form, for use on websites including betting websites, or for use in graphic overlays during broadcasting of wagering events. Other uses of the graphics may occur to the skilled reader.
In several of the decoding steps above the system has multiple options to choose from when decoding the data, depending on how the silk description has been encoded.
The logical rules allow the system to interpret the silk description, but may on occasion return an incorrect graphic. The system may allow a user to check the completed graphics against the original description. The Applicant's system involves the use of flexible rule sets. New rules can be added to those rule sets as required. In addition, the rule sets can easily be adapted to cater for different types of information, such as information on different types of participant (jockey / harness driver / greyhound / soccer team).
Additional or different types of information can also be handled. For example, information relating to the nationality of a jockey or horse could be received by the system and used to generate a flag graphic or some other graphic indicative of the nationality. This could be incorporated with a jockey silk graphic (i.e. in a single graphic file) or could be stored as a separate graphic. The system relies on lists or look up tables as discussed above. The system allows these lists or look up tables to be added to or altered either manually by a user, or automatically as part of the operation of the system (e.g. as a result of user-definition of a parameter previously unknown to the system). Figure 5 shows a computer system for implementation of the Applicant's method.
Memory 50 holds computerised participant information, which may have been manually input at any time by a user. The information may be stored in a database or in any other desired form. Memory 50 also stores other information required by the system, such as lists or look up tables of known item, colour and pattern parameters, variants, spelling errors etc.
A processor 51 extracts the information from the memory 50 and performs the above steps in order to create a graphic representing a participant. In particular, the processor 51 includes a number of modules as follows. A preliminary processing module 51a is configured to receive the participant information and to perform a number of preliminary processing steps, preferably including filtering the data and dividing the data into elements and sub-elements, as described above. A decoding or analysis module 51b then receives the processed data (for example the elements and sub-elements) and implements the decoding or analysis steps discussed above with reference to Figures 2 and 3. The decoded information is received by a graphic generation module 51c which applies the decoded information to a template (which may be retrieved from memory 50) in order to generate a graphic in any desired form (e.g. a standard graphic file format). An output module 51d controls output of the graphic and/or analysed information in a standard form to the various output devices (e.g. the display, electronic publishing system or hard copy publishing system discussed below). The modules may be, or may incorporate, rules engines (e.g. rules engine 51e) or may rely on rules engines 51f, 51g in order to implement the steps of the method described above.
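A hypothetical sketch of how such modules might be chained; the class and variable names below are illustrative only and do not appear in the patent:

```python
# Hypothetical wiring of the processor modules of Figure 5; each module simply consumes
# the previous module's output. The stand-in lambdas are trivial stubs so the example runs.
from typing import Any, Callable, List

class Processor:
    """Chains preliminary processing (51a), decoding (51b) and graphic generation (51c)."""
    def __init__(self, modules: List[Callable[[Any], Any]]):
        self.modules = modules

    def run(self, participant_information: Any) -> Any:
        data = participant_information
        for module in self.modules:
            data = module(data)          # pass the result down the pipeline
        return data

preliminary_51a = lambda text: [e.strip().split() for e in text.split(",")]
decoding_51b = lambda elements: {"body": {"colour": elements[0][0]}}
generation_51c = lambda silk: f"<graphic body={silk['body']['colour']}>"

processor = Processor([preliminary_51a, decoding_51b, generation_51c])
print(processor.run("black, white stripes and green armbands"))
# -> <graphic body=black>
```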
The graphic and/or any intermediate information (such as item and/or colour and/or pattern information reformatted to a standard form) may be stored in memory 50.
User interface devices may be provided to allow a user to control the graphic generation system and to facilitate manual intervention where that is required (for example in order to deal with unknown parameters). The system of Figure 5 includes a number of output devices 52 and one or more user input devices 53. The output devices 52 may include displays (e.g. screens), printers or other desired output devices. The user input devices 53 may include keyboards, pointing devices, touch screens, microphones (with the processor equipped with speech recognition software) or any other desired input device. Any suitable computer-readable medium may be used to store computer instructions for carrying out the steps of the Applicant's method.
In addition to the output devices 52, the processor is preferably linked to a number of other output systems. An electronic publishing system 55 receives graphic information from the processor 51 and combines this with other information, such as form information, odds information, horse name, event information (e.g. race name, race time, track conditions) and any other information which it may be desired to display. This information can then be used to create an electronic publication, such as an online wagering guide or website, which can be displayed to the public in any desired manner, including using a web browser or other suitable software for display on a computer, portable electronic device, mobile phone, e-book reader or any other suitable user display equipment. The electronic publication may be stored in the memory 50, or in external memory 56 for access by those wishing to read the publication. The output systems preferably also include a hard copy publishing system 58. The hard copy publication system receives graphic information from the processor 51 and combines this with other information, such as form information, odds information, horse name, event information (e.g. race name, race time, track conditions) and any other information which may be desired. This information is then sent to a hard copy printing system 59 for printing of physical publications such as racing guides, form books or any other desired form of publication.
While the present invention has been illustrated by the description of the embodiments thereof, and while the embodiments have been described in detail, it is not the intention of the Applicant to restrict or in any way limit the scope of the appended claims to such detail. Additional advantages and modifications will readily appear to those skilled in the art. Therefore, the invention in its broader aspects is not limited to the specific details, representative apparatus and methods, and illustrative examples shown and described. Accordingly, departures may be made from such details without departure from the spirit or scope of the Applicant's general inventive concept.

Claims

1. A computer-implemented method of generating graphics representing participants in a wagering event, including the steps of:
i. receiving participant information associated with a participant in a wagering event, wherein the participant information includes colour and/or pattern information;
ii. identifying colour and/or pattern parameters within the participant information using a comparison to a set of known colour and/or pattern parameters;
iii. analysing the participant information in accordance with a set of rules, the set of rules relating to allowed and/or likely interactions between colour and/or pattern parameters;
iv. applying the analysed participant information to a template, to create a graphic representing the participant information; and
v. storing in memory the graphic, and/or the analysed participant information in a standard form.
2. A method as claimed in claim 1 wherein the received information includes item information, the method including identifying item parameters within the received information using a comparison to a set of known item parameters, and wherein the set of rules relates to allowed and/or likely interactions between item, colour and pattern parameters.
3. A method as claimed in claim 1, including marking each identified colour or pattern parameter as a colour parameter or a pattern parameter.
4. A method as claimed in claim 2, including marking each identified item, colour or pattern parameter as an item parameter, a colour parameter or a pattern parameter.
5. A method as claimed in any preceding claim including identifying any unknown parameters and marking them as unknown parameters.
6. A method as claimed in claim 2 or 4 wherein the item information identifies one or more items of a uniform for a jockey or harness driver from the group: body, sleeves, cap, collar, cuffs and epaulettes.
7. A method as claimed in claim 1 including dividing the received participant information into elements and/or sub-elements.
8. A method as claimed in claim 7 including dividing the received participant information into elements and sub-elements, each element being defined by one or more separators in the received participant information and each sub-element including a single word.
9. A method as claimed in claim 7 or 8 including determining whether an element includes a single word, and if not, wherein step iii) includes analysing interactions between parameters within that element.
10. A method as claimed in claim 7 or 8 wherein step iii) includes determining whether an operator is included within an element and, if so, grouping parameters within the element in accordance with a grouping rule set.
11. A method as claimed in claim 10 wherein step iii) also includes expanding the grouped parameters in accordance with an expansion rule set.
12. A method as claimed in claim 7 or 8 including storing the elements and/or sub- elements in memory.
13. A method as claimed in any preceding claim including: identifying unknown parameters in the participant information; and displaying a user prompt to allow a user to define or correct the unknown parameter.
14. A method as claimed in any one of claims 1 to 12 including: identifying unknown parameters in the participant information; and attempting to analyse the unknown parameters by grouping with known parameters and comparing the combined unknown and known parameters with a set of known compound parameters.
15. A method as claimed in claim 14 including, if the attempt to analyse the unknown parameter is unsuccessful, displaying a user prompt to allow a user to define or correct the unknown parameter.
16. A method as claimed in any preceding claim wherein step iv) includes applying a series of masks or layers to the template in accordance with the analysed participant information.
17. A method as claimed in claim 16 including applying a mask or layer carrying a base colour to the template in accordance with a base colour identified in the participant information.
18. A method as claimed in claim 16 or 17, including applying a mask or layer for each pattern identified in the participant information.
19. A method as claimed in any one of claims 16 to 18 wherein applying a mask involves importing a mask or layer file in accordance with the analysed participant information and applying that mask or layer file to the template.
20. A method as claimed in any preceding claim wherein the received participant information is in the form of a manually input text description.
21. A method as claimed in claim 20 wherein the manually input text description is retrieved from memory.
22. A method as claimed in any preceding claim including, before step ii), the step of filtering variants from the received participant information.
23. A method as claimed in claim 22 wherein the variants are variants of known words and/or separators.
24. A method as claimed in any preceding claim wherein the graphics are representative of a silk colouring and pattern for a jockey or harness driver and the wagering event is a horse race.
25. A method as claimed in any one of claims 1 to 23 wherein the graphics are representative of a dog's rug and the wagering event is a dog race.
26. A method as claimed in any one of claims 1 to 23 wherein the graphics are representative of a player's uniform and the wagering event is a sporting event.
27. A computer readable medium having encoded thereon computer instructions for computer implementation of the method of any one of claims 1 to 26.
28. A graphic generation system for generating graphics representative of participants in a wagering event, the system including:
i. memory; and
ii. a processor configured to receive participant information associated with a participant in a wagering event, wherein the participant information includes colour and/or pattern information, and to perform the following steps:
a. identify colour and/or pattern parameters within the participant information using a comparison to a set of known colour and/or pattern parameters;
b. analyse the participant information in accordance with a set of rules, the set of rules relating to allowed and/or likely interactions between colour and/or pattern parameters;
c. apply the analysed participant information to a template, to create a graphic representing the participant information; and
d. store in the memory the graphic, and/or the analysed participant information in a standard form.
29. A system as claimed in claim 28 wherein the received information includes item information, the processor being configured to identify item parameters within the received information using a comparison to a set of known item parameters, and wherein the set of rules relates to allowed and/or likely interactions between item, colour and pattern parameters.
30. A system as claimed in claim 28, the processor being configured to mark each identified colour or pattern parameter as a colour parameter or a pattern parameter.
31. A system as claimed in claim 29, the processor being configured to mark each identified item, colour or pattern parameter as an item parameter, a colour parameter or a pattern parameter.
32. A system as claimed in any one of claims 28 to 31 the processor being configured to identify any unknown parameters and mark them as unknown parameters.
33. A system as claimed in claim 29 or 31 wherein the item information identifies one or more items of a jockey's uniform, from the group: body, sleeves, cap, collar, cuffs and epaulettes.
34. A system as claimed in claim 28 the processor being configured to divide the received participant information into elements and/or sub-elements.
35. A system as claimed in claim 34 the processor being configured to divide the received participant information into elements and sub-elements, each element being defined by one or more separators in the received participant information and each sub-element including a single word.
36. A system as claimed in claim 34 or 35 the processor being configured to determine whether an element includes a single word, and if not, to analyse interactions between parameters within that element.
37. A system as claimed in claim 34 or 35 the processor being configured to determine whether an operator is included within an element and, if so, to group parameters within the element in accordance with a grouping rule set.
38. A system as claimed in claim 37 the processor being configured to expand the grouped parameters in accordance with an expansion rule set.
39. A system as claimed in claim 34 or 35 the processor being configured to store the elements and/or sub-elements in the memory.
40. A system as claimed in any one of claims 28 to 39 including a user display device and a user input device, the processor being configured to identify unknown parameters in the participant information; and to display a user prompt on the user display device to allow a user to define or correct the unknown parameter using the user input device.
41. A system as claimed in any one of claims 28 to 39 the processor being configured to identify unknown parameters in the participant information; and to attempt to analyse the unknown parameters by grouping with known parameters and comparing the combined unknown and known parameters with a set of known compound parameters.
42. A system as claimed in claim 41 including a user display device and a user input device, the processor being configured, if the attempt to analyse the unknown parameter is unsuccessful, to display a user prompt on the user display device to allow a user to define or correct the unknown parameter using the user input device.
43. A system as claimed in any one of claims 28 to 42 the processor being configured to apply a series of masks or layers to the template in accordance with the analysed participant information.
44. A system as claimed in claim 43 the processor being configured to apply a mask or layer carrying a base colour to the template in accordance with a base colour identified in the participant information.
45. A system as claimed in claim 43 or 44, the processor being configured to apply a mask or layer for each pattern identified in the participant information.
46. A system as claimed in any one of claims 43 to 45 the processor being configured to import a mask or layer file in accordance with the analysed participant information and apply that mask or layer file to the template.
47. A system as claimed in any one of claims 28 to 46 wherein the received participant information is in the form of a manually input text description.
48. A system as claimed in claim 47 wherein the processor is configured to retrieve the manually input text description from the memory.
49. A system as claimed in any one of claims 28 to 48 the processor being configured to filter variants from the received participant information.
50. A system as claimed in claim 49 wherein the variants are variants of known words and/or separators.
51. A system as claimed in any one of claims 28 to 50 wherein the graphics are representative of a silk colouring and pattern for a jockey or harness driver and the wagering event is a horse race.
52. A system as claimed in any one of claims 28 to 50 wherein the graphics are representative of a dog's rug and the wagering event is a dog race.
53. A system as claimed in any one of claims 28 to 50 wherein the graphics are representative of a player's uniform and the wagering event is a sporting event.
54. A system as claimed in any one of claims 28 to 53, including one or more output devices from the group: displays, printers, electronic publishing systems and hardcopy printing systems.
55. A computer-implemented method of generating graphics, including the steps of: i. receiving information including colour and/or pattern information;
ii. identifying colour and/or pattern parameters within the information using a comparison to a set of known colour and/or pattern parameters;
iii. analysing the information in accordance with a set of rules, the set of rules relating to allowed and/or likely interactions between colour and/or pattern parameters;
iv. applying the analysed information to a template, to create a graphic representing the information; and
v. storing in memory the graphic, and/or the analysed information in a standard form.
56. A graphic generation system including:
i. memory; and
ii. a processor configured to receive information including colour and/or pattern information, and to perform the following steps:
e. identify colour and/or pattern parameters within the information using a comparison to a set of known colour and/or pattern parameters;
f. analyse the information in accordance with a set of rules, the set of rules relating to allowed and/or likely interactions between colour and/or pattern parameters;
g. apply the analysed information to a template, to create a graphic representing the information; and
h. store in the memory the graphic, and/or the analysed information in a standard form.
57. A method as claimed in claim 1 substantially as herein described.
58. A system as claimed in claim 28 substantially as herein described.
59. A computer-implemented method of generating graphics representing participants in a wagering event, substantially as herein described with reference to Figures 1 and 2 of the accompanying drawings.

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/NZ2009/000155 WO2011016731A1 (en) 2009-08-04 2009-08-04 Automatic graphic generation


Publications (1)

Publication Number Publication Date
WO2011016731A1 (en)

Family

ID=43544507

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/NZ2009/000155 WO2011016731A1 (en) 2009-08-04 2009-08-04 Automatic graphic generation

Country Status (1)

Country Link
WO (1) WO2011016731A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11376966B2 (en) 2016-07-19 2022-07-05 Auckland Uniservices Limited Electric vehicle detection for roadway wireless power transfer

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5831633A (en) * 1996-08-13 1998-11-03 Van Roy; Peter L. Designating, drawing and colorizing generated images by computer
US6098092A (en) * 1996-11-08 2000-08-01 Silicon Graphics, Inc. Server to dynamically generate graphics for the world wide web
WO2002019271A1 (en) * 2000-08-30 2002-03-07 Screenfriends Corporation A method of generating a graphic image
US6584465B1 (en) * 2000-02-25 2003-06-24 Eastman Kodak Company Method and system for search and retrieval of similar patterns


Similar Documents

Publication Publication Date Title
CN107729445B (en) HTML 5-based large text reading positioning and displaying method
CN107798321A (en) A kind of examination paper analysis method and computing device
JP5113108B2 (en) Note name identification device, note name identification method, and note name identification program
US20140330850A1 (en) Fast identification of complex strings in a data stream
CN107302645A (en) A kind of image processing apparatus and its image processing method
US20060122956A1 (en) Electronic document management apparatus and electronic document management program
CN109657114B (en) Method for extracting webpage semi-structured data
CN115953175B (en) Multiple anti-counterfeiting method, system, equipment and storage medium based on surface layer identification
Villegas et al. Overview of the ImageCLEF 2012 Scalable Web Image Annotation Task.
Goëau et al. Pl@ntnet mobile 2014: Android port and new features
CN109726369A (en) A kind of intelligent template questions record Implementation Technology based on normative document
US8526744B2 (en) Document processing apparatus and computer readable medium
WO2011016731A1 (en) Automatic graphic generation
WO2009021996A3 (en) Method for fast up-scaling of color images and method for interpretation of digitally acquired documents
CN110851631B (en) Retrieval system
WO2013020325A1 (en) A method for retrieving associated information using an image
CN109064373B (en) Privacy protection method based on outsourcing image data entry
KR100820770B1 (en) Method for providing communication through image comment
CN100543726C (en) A kind of method and system of check and correction
CN107194390A (en) A kind of method of watermark in identification PDF document
JP2007241355A (en) Image processor and image processing program
CN108052525A (en) Obtain method, apparatus, storage medium and the electronic equipment of audio-frequency information
CN112613572B (en) Sample data obtaining method and device, electronic equipment and storage medium
WO2014031022A1 (en) Method and system for searching for copyright infringements in images
JP3750406B2 (en) Document filing device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 09848103; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 09848103; Country of ref document: EP; Kind code of ref document: A1)