KLL Compiler

expression.py 17KB

KLL Compiler Re-Write

This was many months of effort in re-designing how the KLL compiler should work. The major problem with the original compiler was how difficult it was to extend language-wise, which led to many delays in implementing KLL 0.4 and 0.5.

The new design is a multi-stage compiler, where even tokenization occurs over multiple stages. This allows individual parsing and token regexes to be expressed more simply, without affecting other expressions.

Another area of change is the concept of Contexts. In the original KLL compiler, the idea of a cached assignment was "hacked" on when I realized the language was "broken" (after nearly finishing the compiler). Since assignment order is generally considered not to matter for keymappings, I created a "cached" assignment where the whole file is read into a sub-datastructure, then applied to the master datastructure. Unfortunately, this wasn't really all that clear, so it was annoying to work with. To remedy this, I created KLL Contexts, which contain information about a group of expressions. Not only can these groups be merged with other Contexts, they carry historical data about how they were generated, allowing errors very late in processing to be pin-pointed back to the offending kll file.

Backends work nearly the same as they did before; however, all call-backs for capability evaluations have been removed. This makes the interface much cleaner, as Contexts can only be symbolically merged now. (Previously, datastructures did evaluation merges where the ScanCode or Capability was looked up right before passing to the backend, but this required additional information from the backend.)

Many of the old parsing and tokenization rules have been reused, along with the hid_dict.py code.

The new design takes advantage of processor pools to handle multithreading where it makes sense. For example, all specified files are loaded into RAM simultaneously rather than read from disk sparingly. The reason for this is so that each Context always has all the information it requires at all times.
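To make the Context idea concrete, here is a minimal sketch (illustration only, not the actual common/context.py API; the SketchContext name and its fields are made up) of merging two groups of expressions while keeping track of which kll file each group came from:

    class SketchContext:
        '''Illustrative stand-in for a KLL Context: a group of expressions plus lineage'''
        def __init__( self, kll_file, expressions ):
            self.expressions = expressions   # parsed expressions from this file
            self.lineage = [ kll_file ]      # history of where the expressions came from

        def merge( self, other ):
            '''Symbolically merge another context in, preserving lineage'''
            merged = SketchContext( self.lineage[0], list( self.expressions ) )
            merged.expressions += other.expressions
            merged.lineage = self.lineage + other.lineage
            return merged

    base = SketchContext( 'Base.kll', [ 'S1 : U"A";' ] )
    user = SketchContext( 'MyMap.kll', [ 'S1 : U"B";' ] )
    combined = base.merge( user )
    print( combined.lineage )  # ['Base.kll', 'MyMap.kll'] -- a late error can still name the offending file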
kll
 - Program entry point (previously kll.py)
 - Very small now, does some setting up of command-line args
 - Most command-line args are specified by the corresponding processing stage

common/channel.py
 - Pixel Channel container classes

common/context.py
 - Context container classes
 - As is usual with other files, blank classes inherit a base class
 - These blank classes are identified by the class name itself to handle special behaviour
 - And if/when necessary, functions are re-implemented
 - MergeContext class facilitates merging of contexts while maintaining lineage

common/expression.py
 - Expression container classes
   * Expression base class
   * AssignmentExpression
   * NameAssociationExpression
   * DataAssociationExpression
   * MapExpression
 - These classes are used to store expressions after they have finished parsing and tokenization

common/file.py
 - Container class for files being read by the KLL compiler

common/emitter.py
 - Base class for all KLL emitters
 - TextEmitter for dealing with text file templates

common/hid_dict.py
 - Slightly modified version of kll_lib/hid_dict.py

common/id.py
 - Identification container classes
 - Used to identify different types of elements used within the KLL language

common/modifier.py
 - Container classes for animation and pixel change functions

common/organization.py
 - Data structure merging container classes
 - Contains all the sub-datastructure classes as well
 - The Organization class handles the merge orchestration and expression insertion

common/parse.py
 - Parsing rules for funcparserlib
 - Much of this file was taken from the original kll.py
 - Many changes to support the multi-stage processing and support KLL 0.5

common/position.py
 - Container class dealing with physical positions

common/schedule.py
 - Container class dealing with scheduling and timing events

common/stage.py
 - Contains ControlStage and main Stage classes
   * CompilerConfigurationStage
   * FileImportStage
   * PreprocessorStage
   * OperationClassificationStage
   * OperationSpecificsStage
   * OperationOrganizationStage
   * DataOrganizationStage
   * DataFinalizationStage
   * DataAnalysisStage
   * CodeGenerationStage
   * ReportGenerationStage
 - Each of these classes controls the life-cycle of its stage (a rough life-cycle sketch follows this listing)
 - If multi-threading is desired, it must be handled within the class
   * The next stage will not start until the current stage is finished
 - Errors are handled such that as many errors as possible are recorded before forcing an exit
   * The exit is handled at the end of each stage if necessary
 - Command-line arguments for each stage can be defined if necessary (they are given their own grouping)
 - Each stage can pull variables and functions from other stages if necessary using a name lookup
   * This means you don't have to worry about over-arching datastructures

emitters/emitters.py
 - Container class for KLL emitters
 - Handles emitter setup and selection

emitters/kiibohd/kiibohd.py
 - kiibohd .h file KLL emitter
 - Re-uses some backend code from the original KLL compiler

funcparserlib/parser.py
 - Added debug mode control

examples/assignment.kll
examples/defaultMapExample.kll
examples/example.kll
examples/hhkbpro2.kll
examples/leds.kll
examples/mapping.kll
examples/simple1.kll
examples/simple2.kll
examples/simpleExample.kll
examples/state_scheduling.kll
 - Updating/Adding rules for new compiler and KLL 0.4 + KLL 0.5 support
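As a rough illustration of the stage life-cycle described above (a sketch only; the real ControlStage/Stage classes in common/stage.py carry much more machinery, and the names used here are placeholders):

    import sys

    class SketchStage:
        '''Placeholder for a single compiler stage'''
        def __init__( self, name ):
            self.name = name
            self.errors = []   # errors are accumulated rather than raised immediately

        def process( self ):
            '''Run the stage to completion, recording any errors along the way'''
            pass

    class SketchControl:
        '''Runs stages strictly in order; a stage must finish before the next one starts'''
        def __init__( self, stages ):
            self.stages = stages

        def process( self ):
            for stage in self.stages:
                stage.process()
                # Exit is only considered once the stage has finished,
                # so as many errors as possible are reported together
                if stage.errors:
                    for error in stage.errors:
                        print( "ERROR ({0}): {1}".format( stage.name, error ), file=sys.stderr )
                    sys.exit( 1 )

    SketchControl( [ SketchStage( 'FileImport' ), SketchStage( 'Preprocessor' ) ] ).process()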
#!/usr/bin/env python3
'''
KLL Expression Container
'''

# Copyright (C) 2016 by Jacob Alexander
#
# This file is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This file is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this file. If not, see <http://www.gnu.org/licenses/>.


### Imports ###

import copy

from common.id import CapId


### Decorators ###

## Print Decorator Variables
ERROR = '\033[5;1;31mERROR\033[0m:'
WARNING = '\033[5;1;33mWARNING\033[0m:'


### Classes ###

class Expression:
    '''
    Container class for KLL expressions
    '''
    def __init__( self, lparam, operator, rparam, context ):
        '''
        Initialize expression container

        @param lparam:   LOperatorData token
        @param operator: Operator token
        @param rparam:   ROperatorData token
        @param context:  Parent context of expression
        '''
        # First stage/init
        self.lparam_token = lparam
        self.operator_token = operator
        self.rparam_token = rparam
        self.context = context # TODO, set multiple contexts for later stages

        # Second stage
        self.lparam_sub_tokens = []
        self.rparam_sub_tokens = []

        # Mutate class into the desired type
        self.__class__ = {
            '=>' : NameAssociationExpression,
            '<=' : DataAssociationExpression,
            '='  : AssignmentExpression,
            ':'  : MapExpression,
        }[ self.operator_type() ]

    def operator_type( self ):
        '''
        Determine which base operator this operator is of

        All : (map) expressions are tokenized/parsed the same way

        @return Base string representation of the operator
        '''
        if ':' in self.operator_token.value:
            return ':'

        return self.operator_token.value

    def final_tokens( self, no_filter=False ):
        '''
        Return the final list of tokens, must complete the second stage first

        @param no_filter: If true, do not filter out Space tokens

        @return Finalized list of tokens
        '''
        ret = self.lparam_sub_tokens + [ self.operator_token ] + self.rparam_sub_tokens

        if not no_filter:
            ret = [ x for x in ret if x.type != 'Space' ]

        return ret

    def regen_str( self ):
        '''
        Re-construct the string based off the original set of tokens

        <lparam><operator><rparam>;
        '''
        return "{0}{1}{2};".format(
            self.lparam_token.value,
            self.operator_token.value,
            self.rparam_token.value,
        )

    def point_chars( self, pos_list ):
        '''
        Using the regenerated string, point to a given list of characters
        Used to indicate where a possible issue/syntax error is

        @param pos_list: List of character indices

        i.e.
        > U"A" : : U"1";
        >        ^
        '''
        out = "\t{0}\n\t".format( self.regen_str() )

        # Place a ^ character at the given locations
        curpos = 1
        for pos in sorted( pos_list ):
            # Pad spaces, then add a ^
            out += ' ' * (pos - curpos)
            out += '^'
            curpos += pos

        return out

    def rparam_start( self ):
        '''
        Starting position char of rparam_token in a regen_str
        '''
        return len( self.lparam_token.value ) + len( self.operator_token.value )

    def __repr__( self ):
        # Build string representation based off of what has been set
        # lparam, operator and rparam are always set
        out = "Expression: {0}{1}{2}".format(
            self.lparam_token.value,
            self.operator_token.value,
            self.rparam_token.value,
        )

        # TODO - Add more depending on what has been set
        return out

    def unique_keys( self ):
        '''
        Generates a list of unique identifiers for the expression that is mergeable
        with other functionally equivalent expressions.

        This method should never get called directly as a generic Expression
        '''
        return [ ('UNKNOWN KEY', 'UNKNOWN EXPRESSION') ]


class AssignmentExpression( Expression ):
    '''
    Container class for assignment KLL expressions
    '''
    type = None
    name = None
    pos = None
    value = None

    ## Setters ##
    def array( self, name, pos, value ):
        '''
        Assign array assignment parameters to expression

        @param name:  Name of variable
        @param pos:   Array position of the value (if None, overwrite the entire array)
        @param value: Value of the array, if pos is specified, this is the value of an element

        @return: True if parsing was successful
        '''
        self.type = 'Array'
        self.name = name
        self.pos = pos
        self.value = value

        # If pos is not none, flatten
        if pos is not None:
            self.value = "".join( str( x ) for x in self.value )

        return True

    def merge_array( self, new_expression=None ):
        '''
        Merge arrays, used for position assignments
        Merges unconditionally, make sure this is what you want to do first

        If no additional array is specified, just "cap-off" array.
        This does a proper array expansion into a python list.

        @param new_expression: AssignmentExpression type array, ignore if None
        '''
        # First, check if base expression needs to be capped
        if self.pos is not None:
            # Generate a new string array
            new_value = [""] * self.pos

            # Append the old contents to the list
            new_value.append( self.value )
            self.value = new_value

            # Clear pos, to indicate that array has been capped
            self.pos = None

        # Next, if a new_expression has been specified, merge in
        if new_expression is not None and new_expression.pos is not None:
            # Check if we need to extend the list
            new_size = new_expression.pos + 1 - len( self.value )
            if new_size > 0:
                self.value.extend( [""] * new_size )

            # Assign value to array
            self.value[ new_expression.pos ] = new_expression.value

    def variable( self, name, value ):
        '''
        Assign variable assignment parameters to expression

        @param name:  Name of variable
        @param value: Value of variable

        @return: True if parsing was successful
        '''
        self.type = 'Variable'
        self.name = name
        self.value = value

        # Flatten value, often a list of various token types
        self.value = "".join( str( x ) for x in self.value )

        return True

    def __repr__( self ):
        if self.type == 'Variable':
            return "{0} = {1};".format( self.name, self.value )
        elif self.type == 'Array':
            # Output KLL style array, double quoted elements, space-separated
            if isinstance( self.value, list ):
                output = "{0}[] =".format( self.name )
                for value in self.value:
                    output += ' "{0}"'.format( value )
                output += ";"
                return output
            # Single array assignment
            else:
                return "{0}[{1}] = {2};".format( self.name, self.pos, self.value )

        return "ASSIGNMENT UNKNOWN"

    def unique_keys( self ):
        '''
        Generates a list of unique identifiers for the expression that is mergeable
        with other functionally equivalent expressions.
        '''
        return [ ( self.name, self ) ]


class NameAssociationExpression( Expression ):
    '''
    Container class for name association KLL expressions
    '''
    type = None
    name = None
    association = None

    ## Setters ##
    def capability( self, name, association, parameters ):
        '''
        Assign a capability C function name association

        @param name:        Name of capability
        @param association: Name of capability in target backend output

        @return: True if parsing was successful
        '''
        self.type = 'Capability'
        self.name = name
        self.association = CapId( association, 'Definition', parameters )

        return True

    def define( self, name, association ):
        '''
        Assign a define C define name association

        @param name:        Name of variable
        @param association: Name of association in target backend output

        @return: True if parsing was successful
        '''
        self.type = 'Define'
        self.name = name
        self.association = association

        return True

    def __repr__( self ):
        return "{0} <= {1};".format( self.name, self.association )

    def unique_keys( self ):
        '''
        Generates a list of unique identifiers for the expression that is mergeable
        with other functionally equivalent expressions.
        '''
        return [ ( self.name, self ) ]


class DataAssociationExpression( Expression ):
    '''
    Container class for data association KLL expressions
    '''
    type = None
    association = None
    value = None

    ## Setters ##
    def animation( self, animations, animation_modifiers ):
        '''
        Animation definition and configuration

        @return: True if parsing was successful
        '''
        self.type = 'Animation'
        self.association = animations
        self.value = animation_modifiers

        return True

    def animationFrame( self, animation_frames, pixel_modifiers ):
        '''
        Pixel composition of an Animation Frame

        @return: True if parsing was successful
        '''
        self.type = 'AnimationFrame'
        self.association = animation_frames
        self.value = pixel_modifiers

        return True

    def pixelPosition( self, pixels, position ):
        '''
        Pixel Positioning

        @return: True if parsing was successful
        '''
        for pixel in pixels:
            pixel.setPosition( position )

        self.type = 'PixelPosition'
        self.association = pixels

        return True

    def scanCodePosition( self, scancodes, position ):
        '''
        Scan Code to Position Mapping

        Note: Accepts lists of scan codes
              Alone this isn't useful, but you can assign rows and columns using ranges instead of individually

        @return: True if parsing was successful
        '''
        for scancode in scancodes:
            scancode.setPosition( position )

        self.type = 'ScanCodePosition'
        self.association = scancodes

        return True

    def __repr__( self ):
        if self.type in ['PixelPosition', 'ScanCodePosition']:
            output = ""
            for index, association in enumerate( self.association ):
                if index > 0:
                    output += "; "
                output += "{0}".format( association )
            return "{0};".format( output )

        return "{0} <= {1};".format( self.association, self.value )

    def unique_keys( self ):
        '''
        Generates a list of unique identifiers for the expression that is mergeable
        with other functionally equivalent expressions.
        '''
        keys = []

        # Positions require a bit more introspection to get the unique keys
        if self.type in ['PixelPosition', 'ScanCodePosition']:
            for index, key in enumerate( self.association ):
                uniq_expr = self

                # If there is more than one key, copy the expression
                # and remove the non-related variants
                if len( self.association ) > 1:
                    uniq_expr = copy.copy( self )

                    # Isolate variant by index
                    uniq_expr.association = [ uniq_expr.association[ index ] ]

                keys.append( ( "{0}".format( key.unique_key() ), uniq_expr ) )

        # AnimationFrames are already list of keys
        # TODO Reorder frame assignments to dedup function equivalent mappings
        elif self.type in ['AnimationFrame']:
            for index, key in enumerate( self.association ):
                uniq_expr = self

                # If there is more than one key, copy the expression
                # and remove the non-related variants
                if len( self.association ) > 1:
                    uniq_expr = copy.copy( self )

                    # Isolate variant by index
                    uniq_expr.association = [ uniq_expr.association[ index ] ]

                keys.append( ( "{0}".format( key ), uniq_expr ) )

        # Otherwise treat as a single element
        else:
            keys = [ ( "{0}".format( self.association ), self ) ]

        # Remove any duplicate keys
        # TODO Stat? Might be a neat report about how many duplicates were squashed
        keys = list( set( keys ) )

        return keys


class MapExpression( Expression ):
    '''
    Container class for KLL map expressions
    '''
    type = None
    triggers = None
    operator = None
    results = None
    animation = None
    animation_frame = None
    pixels = None
    position = None

    ## Setters ##
    def scanCode( self, triggers, operator, results ):
        '''
        Scan Code mapping

        @param triggers: Sequence of combos of ranges of namedtuples
        @param operator: Type of map operation
        @param results:  Sequence of combos of ranges of namedtuples

        @return: True if parsing was successful
        '''
        self.type = 'ScanCode'
        self.triggers = triggers
        self.operator = operator
        self.results = results

        return True

    def usbCode( self, triggers, operator, results ):
        '''
        USB Code mapping

        @param triggers: Sequence of combos of ranges of namedtuples
        @param operator: Type of map operation
        @param results:  Sequence of combos of ranges of namedtuples

        @return: True if parsing was successful
        '''
        self.type = 'USBCode'
        self.triggers = triggers
        self.operator = operator
        self.results = results

        return True

    def animationTrigger( self, animation, operator, results ):
        '''
        Animation Trigger mapping

        @param animation: Animation trigger of result
        @param operator:  Type of map operation
        @param results:   Sequence of combos of ranges of namedtuples

        @return: True if parsing was successful
        '''
        self.type = 'Animation'
        self.animation = animation
        self.triggers = animation
        self.operator = operator
        self.results = results

        return True

    def pixelChannels( self, pixelmap, trigger ):
        '''
        Pixel Channel Composition

        @return: True if parsing was successful
        '''
        self.type = 'PixelChannel'
        self.pixel = pixelmap
        self.position = trigger

        return True

    def sequencesOfCombosOfIds( self, expression_param ):
        '''
        Prettified Sequence of Combos of Identifiers

        @param expression_param: Trigger or Result parameter of an expression

        Scan Code Example
        [[[S10, S16], [S42]], [[S11, S16], [S42]]] -> (S10 + S16, S42)|(S11 + S16, S42)
        '''
        output = ""

        # Sometimes during error cases, might be None
        if expression_param is None:
            return output

        # Iterate over each trigger/result variants (expanded from ranges), each one is a sequence
        for index, sequence in enumerate( expression_param ):
            if index > 0:
                output += "|"
            output += "("

            # Iterate over each combo (element of the sequence)
            for index, combo in enumerate( sequence ):
                if index > 0:
                    output += ", "

                # Iterate over each trigger identifier
                for index, identifier in enumerate( combo ):
                    if index > 0:
                        output += " + "
                    output += "{0}".format( identifier )

            output += ")"

        return output

    def elems( self ):
        '''
        Return number of trigger and result elements

        Useful for determining if this is a trigger macro (2+)
        Should always return at least (1,1) unless it's an invalid calculation

        @return: ( triggers, results )
        '''
        elems = [ 0, 0 ]

        # XXX Needed?
        if self.type == 'PixelChannel':
            return tuple( elems )

        # Iterate over each trigger variant (expanded from ranges), each one is a sequence
        for sequence in self.triggers:
            # Iterate over each combo (element of the sequence)
            for combo in sequence:
                # Just measure the size of the combo
                elems[0] += len( combo )

        # Iterate over each result variant (expanded from ranges), each one is a sequence
        for sequence in self.results:
            # Iterate over each combo (element of the sequence)
            for combo in sequence:
                # Just measure the size of the combo
                elems[1] += len( combo )

        return tuple( elems )

    def trigger_str( self ):
        '''
        String version of the trigger
        Used for sorting
        '''
        # Pixel Channel Mapping doesn't follow the same pattern
        if self.type == 'PixelChannel':
            return "{0}".format( self.pixel )

        return "{0}".format(
            self.sequencesOfCombosOfIds( self.triggers ),
        )

    def result_str( self ):
        '''
        String version of the result
        Used for sorting
        '''
        # Pixel Channel Mapping doesn't follow the same pattern
        if self.type == 'PixelChannel':
            return "{0}".format( self.position )

        return "{0}".format(
            self.sequencesOfCombosOfIds( self.results ),
        )

    def __repr__( self ):
        # Pixel Channel Mapping doesn't follow the same pattern
        if self.type == 'PixelChannel':
            return "{0} : {1};".format( self.pixel, self.position )

        return "{0} {1} {2};".format(
            self.sequencesOfCombosOfIds( self.triggers ),
            self.operator,
            self.sequencesOfCombosOfIds( self.results ),
        )

    def unique_keys( self ):
        '''
        Generates a list of unique identifiers for the expression that is mergeable
        with other functionally equivalent expressions.

        TODO: This function should re-order combinations to generate the key.
              The final generated combo will be in the original order.
        '''
        keys = []

        # Pixel Channel only has key per mapping
        if self.type == 'PixelChannel':
            keys = [ ( "{0}".format( self.pixel ), self ) ]

        # Split up each of the keys
        else:
            # Iterate over each trigger/result variants (expanded from ranges), each one is a sequence
            for index, sequence in enumerate( self.triggers ):
                key = ""
                uniq_expr = self

                # If there is more than one key, copy the expression
                # and remove the non-related variants
                if len( self.triggers ) > 1:
                    uniq_expr = copy.copy( self )

                    # Isolate variant by index
                    uniq_expr.triggers = [ uniq_expr.triggers[ index ] ]

                # Iterate over each combo (element of the sequence)
                for index, combo in enumerate( sequence ):
                    if index > 0:
                        key += ", "

                    # Iterate over each trigger identifier
                    for index, identifier in enumerate( combo ):
                        if index > 0:
                            key += " + "
                        key += "{0}".format( identifier )

                # Add key to list
                keys.append( ( key, uniq_expr ) )

        # Remove any duplicate keys
        # TODO Stat? Might be a neat report about how many duplicates were squashed
        keys = list( set( keys ) )

        return keys
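For reference, here is a short usage sketch of the Expression container defined above. It assumes the repository root is on PYTHONPATH so that common.expression is importable; the Token namedtuple is only a stand-in for the tokens the real tokenizer produces.

    from collections import namedtuple

    from common.expression import Expression

    # Stand-in token type; only the .type and .value fields used by Expression are provided
    Token = namedtuple( 'Token', [ 'type', 'value' ] )

    # U"A" : U"B";  -- a map expression, so Expression mutates itself into MapExpression
    lparam   = Token( 'LOperatorData', 'U"A" ' )
    operator = Token( 'Operator', ':' )
    rparam   = Token( 'ROperatorData', ' U"B"' )

    expr = Expression( lparam, operator, rparam, context=None )

    print( type( expr ).__name__ )  # MapExpression
    print( expr.regen_str() )       # U"A" : U"B";
    print( expr.rparam_start() )    # 6 -- character offset where rparam begins in the regenerated string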