The invention relates to recognising events performed on a user interface. This is achieved using concurrent processing, which the Examining Division considered obvious.

The applicant argued that the feature accelerates event recognition in general. While the Board disagreed with such a general effect, it accepted that the feature does provide a technical effect, namely allowing views to recognise user input events comprised of sub-events not occurring in that view.

The Board also accepted that concurrency provides a degree of flexibility in handling conflicts between objects within a hierarchy and allows for a speed-up at least in certain situations. Even though this speed-up is not achieved for all possible scenarios, the Board acknowledged that the skilled person would understand the options and limits of concurrency. The subject-matter was therefore considered inventive.

Here are the practical takeaways from the decision T 1644/20 (Event recognition/APPLE) of April 11, 2024, of the Technical Board of Appeal 3.5.06.

Key takeaways

Allowing views to recognise user input events comprised of sub-events not in that view is a technical effect, as this adds new functionality to the user interface.

Concurrency provides a degree of flexibility in handling the mentioned conflicts between objects within a hierarchy, and allows for a speed up at least in certain situations. The board accepts this as a technical effect. Moreover, even though this speed up is not achieved for all possible scenarios in question, the board considers that the skilled person would understand the options and limits of concurrency.

The invention

The Board summarized the invention as follows:

2.2 The application relates to recognising user interface “events” comprising one or more “sub-events” in an electronic device. The user interface can comprise a display and/or input devices such as a touch-sensitive surface. In the case of a touch-sensitive display, an event can be a touch-based gesture. The device may recognise a set of touch-based gestures, such as a tap, double tap, swipe, pinch or “depinch”. A gesture, being an event, comprises sub-events. For instance, the “tap” gesture starts with a “finger down” sub-event.

2.3 The invention concerns an application including a view hierarchy. The lowest (finest) view in the hierarchy is that in which a user input sub-event, such as a “finger down” sub-event (see above), is detected. This “hit view” then determines which views are to be “actively involved” in recognising the sub-events. Each view is associated with software including one or more “event recognizers”, each having an event definition based on one or more sub-events. When an event recogniser detects an event, an event handler specifies an action for a target and sends the action to the target.

2.4 Figure 3A illustrates a view hierarchy consisting of an outermost view (302) encompassing the entire user interface and including subordinate views (search results panel 304, search text field 306 and home row 310). Subordinate views, for instance the search results panel (304), may themselves contain subordinate views; see subordinate view 305 “Maps view” for each search result.

2.5 A touch sub-event 301-1 is processed by outermost view 302 and, depending on its location, the subordinate views that it lies in, such as search results panel 304 and maps view 305, shown in figure 3A as 301-2 and 301-3. Hence the “actively involved” views of the touch sub-event (dotted circles 301-1, 301-2 and 301-3) shown in figure 3A include outermost view 302, search results panel 304 and maps view 305.
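As a non-authoritative sketch, the hit-view determination and the set of “actively involved” views described in points 2.3 to 2.5 could look roughly as follows in Python. The class and function names are our own, and the view frames are invented to mirror figure 3A; none of this is taken from the application:

```python
# Illustrative sketch (hypothetical names): each view owns event recognisers;
# the hit view is the lowest view in the hierarchy containing the first
# sub-event, and the actively involved views are that view plus its ancestors.

class View:
    def __init__(self, name, frame, subviews=None, recognizers=None):
        self.name = name
        self.frame = frame              # (x, y, width, height)
        self.subviews = subviews or []
        self.recognizers = recognizers or []

    def contains(self, point):
        x, y = point
        fx, fy, fw, fh = self.frame
        return fx <= x < fx + fw and fy <= y < fy + fh

def hit_view(view, point):
    """Return the lowest (finest) view in the hierarchy containing the point."""
    if not view.contains(point):
        return None
    for sub in view.subviews:
        hit = hit_view(sub, point)
        if hit is not None:
            return hit
    return view

def actively_involved(root, point):
    """The chain of views containing the point, down to the hit view (cf. figure 3A)."""
    views, view = [], root
    while view is not None and view.contains(point):
        views.append(view)
        view = next((s for s in view.subviews if s.contains(point)), None)
    return views

# Invented frames mirroring figure 3A: outermost view 302 > panel 304 > maps 305
maps = View("305 maps", (10, 20, 80, 30))
panel = View("304 panel", (0, 10, 100, 60), subviews=[maps])
root = View("302 outermost", (0, 0, 100, 100), subviews=[panel])

touch = (20, 25)  # a sub-event like 301-1, landing inside all three views
print(hit_view(root, touch).name)                        # "305 maps"
print([v.name for v in actively_involved(root, touch)])  # 302, 304, 305
```

With these invented frames, the touch yields maps view 305 as the hit view, and the actively involved views are exactly the three dotted circles of figure 3A.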

2.6 Figures 3B and 3C illustrate methods and structures related to event recognisers, the claims being directed to the case in figure 3B in which event handlers are associated with particular views within a hierarchy of views. The hit view determination module (314) establishes whether a sub-event has taken place within one or more views and, if so, identifies the lowest view in the view hierarchy as the “hit-view”. The “actively involved” views receive not only the first sub-event but also all following ones related to the same touch source, even if the gesture leaves the hit view. For each of the actively involved views, one or more gesture recognisers, as illustrated in figures 4A, 4B and 5A to 5C (see [84-105]), use a state machine to identify a predefined sequence of sub-events to recognise a gesture such as a “scrolling event”.
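The state-machine idea in point 2.6 can be sketched as follows. This is our own simplified model, not the claimed implementation: a recogniser consumes sub-events one at a time and recognises a gesture once its predefined sequence has been matched (the gesture and sub-event names are illustrative):

```python
# Hedged sketch of an event recogniser as a state machine: it consumes
# sub-events one at a time and moves from "possible" to "recognized" when
# its predefined sub-event sequence is complete, or to "failed" on mismatch.

class GestureRecognizer:
    def __init__(self, name, sequence):
        self.name = name
        self.sequence = sequence   # predefined sequence of sub-events
        self.index = 0
        self.state = "possible"    # possible -> recognized | failed

    def feed(self, sub_event):
        if self.state != "possible":
            return self.state      # terminal states absorb further input
        if sub_event == self.sequence[self.index]:
            self.index += 1
            if self.index == len(self.sequence):
                self.state = "recognized"
        else:
            self.state = "failed"
        return self.state

tap = GestureRecognizer("tap", ["finger_down", "finger_up"])
swipe = GestureRecognizer("swipe", ["finger_down", "finger_move", "finger_up"])

for sub_event in ["finger_down", "finger_up"]:
    tap.feed(sub_event)
    swipe.feed(sub_event)

print(tap.state)    # "recognized"
print(swipe.state)  # "failed" (it expected finger_move after finger_down)
```

Feeding the same two sub-events to both recognisers illustrates how one gesture is recognised while a competing one fails, which is what the conflict-resolution discussion later in the decision turns on.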

  • Claim 1

Is it patentable?

The closest prior art (D1: US 5 627 959 A) relates to displaying graphic objects on a screen and using a mouse-controlled pointer to manipulate the objects. The distinguishing features were identified as follows:

6.3 The board finds that, in view of the above analysis, the subject-matter of claim 1 of the first auxiliary request differs from the disclosure of D1 in the following steps:

a. the hit view establishes multiple views in the view hierarchy as actively involved views in which the first sub-event is detected, wherein the actively involved views comprise the hit view;

b. delivering a respective sub-event of the input event to input event recognizers for each view of the multiple actively involved views within the view hierarchy and

c. at the input event recognizers for the actively involved views in the view hierarchy, concurrently processing the respective sub-event prior to concurrently processing a next sub-event in the sequence of sub-events of the input event at the input event recognizers for each actively involved view in the view hierarchy.
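A hedged sketch of how difference features “b” and “c” might operate: every sub-event of the input event is delivered to the recognisers of each actively involved view, and all of them process it before the next sub-event in the sequence is dispatched. The recogniser below is a simplified stand-in and all names are our own illustrative assumptions:

```python
# Sketch of features "b" and "c": lockstep delivery of each sub-event to the
# event recognisers of every actively involved view, one sub-event at a time.

class Recognizer:
    def __init__(self, name, sequence):
        self.name, self.sequence = name, sequence
        self.i, self.state = 0, "possible"

    def feed(self, sub_event):
        if self.state == "possible":
            if sub_event == self.sequence[self.i]:
                self.i += 1
                if self.i == len(self.sequence):
                    self.state = "recognized"
            else:
                self.state = "failed"
        return self.state

def dispatch(sub_events, recognizers_by_view):
    """Deliver each sub-event to every actively involved view's recognisers
    before moving on to the next sub-event (features b and c)."""
    order = []
    for sub_event in sub_events:
        for view, recognizers in recognizers_by_view.items():
            for r in recognizers:
                r.feed(sub_event)
                order.append((sub_event, view, r.name))
    return order

recognizers_by_view = {
    "maps view 305":  [Recognizer("tap", ["down", "up"])],
    "panel view 304": [Recognizer("scroll", ["down", "move", "up"])],
}
order = dispatch(["down", "move", "up"], recognizers_by_view)

# Both views saw "down" before either saw "move":
print(order[:2])
```

Here the panel view's scroll recogniser ends up recognising the event even though the gesture started in the maps view, which is precisely the effect the Board relies on in point 6.5 below.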

The applicant initially argued for a general technical effect, with which the Board disagreed:

6.4 Regarding the effect of these difference features, the board is not convinced that they accelerate event recognition in general. Concurrent processing, as opposed to parallel processing, cannot speed up the overall execution of all tasks. Notably if one uses non-parallel hardware, the overall execution time cannot improve if one carries out the individual tasks in an interleaved manner. One might achieve a speed increase in the situation in which the overall computation requires only some of the individual tasks to complete. For example, if, once the first (or the first few) event recogniser has recognised a series of sub-events as an event, all other recognisers were terminated. The board also notes that the invention does not change the variety of sub-event sequences that can be recognised. The ABO in D1 can recognise sub-events, and, in the same way, each recogniser in claim 1 can recognise an event comprising a single sub-event.
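The Board's point can be made concrete with a toy model of our own: on a single, non-parallel core, round-robin interleaving still executes every work unit, so the total cost is unchanged; a saving appears only if the remaining recognisers are terminated once the first one completes. The unit costs below are invented for illustration:

```python
# Toy model of point 6.4: interleaving (concurrency) on non-parallel hardware
# does not reduce total work; early termination of the other recognisers does.

def total_steps(steps, stop_after_first=False):
    """Round-robin one work unit at a time across recognisers; optionally
    stop everything once the first recogniser completes all its steps."""
    remaining = list(steps)
    executed = 0
    while any(remaining):
        for i, r in enumerate(remaining):
            if r > 0:
                remaining[i] -= 1
                executed += 1
                if stop_after_first and remaining[i] == 0:
                    return executed
    return executed

print(total_steps([4, 4, 4]))                         # 12: interleaving saves nothing
print(total_steps([4, 4, 4], stop_after_first=True))  # 10: saving only via termination
```

With three recognisers of four units each, interleaving alone still costs all twelve units; terminating the others when the first recogniser finishes is what produces the conditional speed-up the Board describes.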

However, the Board agreed that the feature does have a different technical effect:

6.5 The board finds however that the difference features do have a technical effect, namely to allow views to recognise user input events comprised by sub-events not in that view. For instance, a scrolling event (see figure 5A) in the maps view (305) in figure 3A can be recognised by a recogniser of the search results panel view (304). This adds a new functionality to the user interface, since the maps view cannot itself scroll, but the search results panel view (304), being a list, can.

The Board then considered that the feature was not obvious to the skilled person starting from the closest prior art:

6.6 In the system of D1, it may happen that two objects above each other in the hierarchy are both able to process a mouse event. If the user intends to interact with one of these objects via a sequence of mouse events, it may happen that another object covers it and receives the mouse events against the user’s wishes. The board considers that this situation is undesirable and also that the user would recognise this situation as undesirable. Addressing this shortcoming of D1 would therefore have been obvious. The board considers that the skilled person would have addressed the problem of avoiding the problem that it is difficult, even impossible at times, for the user to interact with objects higher up in the hierarchy – or, equivalently, more in the background of the user interface.

6.7 The board sees several ways of approaching this problem. One is to instruct the user to always interact with a visible part of the object of interest. Another would be to define the events in such a way that there is no ambiguity as to which event is addressed to which object. Both may not be possible for practical reasons. Yet another option seems to be to send a sequence of events to the event recognisers of several or all objects covering one another. This choice already differs substantially from the solution proposed in D1. And it still lacks the claimed concurrent execution of event handlers. This concurrency however provides a degree of flexibility in handling the mentioned conflicts between objects within a hierarchy, and allows for a speed up at least in certain situations. The board accepts this as a technical effect. Moreover, even though this speed up is not achieved for all possible scenarios in question – in particular event definitions and conflict resolution “policies” – the board considers that the skilled person would understand the options and limits of concurrency.

6.8 The board agrees with the applicant that D1 contains no hint of a plurality of objects processing and possibly recognising user input sub-events concurrently, as set out in difference features “a” to “c” above.

6.9 The board concludes that the subject-matter of claim 1 of the first auxiliary request involves an inventive step in view of the disclosure of D1.

Therefore, the Board agreed that the subject-matter was inventive over the cited document and remitted the application to the first instance for further examination.

More information

You can read the full decision here: T 1644/20 (Event recognition/APPLE) of April 11, 2024, of the Technical Board of Appeal 3.5.06.
