How about a 2-paren version?
import operator
from functools import reduce  # needed on Python 3

x = [1, 2, 3]
reduce(operator.add, [['insert', a] for a in x])
['insert', 1, 'insert', 2, 'insert', 3]
Or, if one prefers the parens the other way around, with a generator
expression instead of a list comprehension:
reduce(operator.add, (['insert', a] for a in x))
['insert', 1, 'insert', 2, 'insert', 3]
Or, if you don't want to import the operator module:
sum((['insert', a] for a in x), [])
['insert', 1, 'insert', 2, 'insert', 3]
All of the solutions based on adding (concatenating) lists create an
unneeded temporary list for each addition except the last, and so run in
O(n**2) time. Starting with one list and appending or extending (which
amounts to two appends per item here) is the 'proper' approach and gives
an O(n) algorithm. This does not matter for n = 3, but for n = 10000 it
would.
expanded = []
expand = expanded.append
for item in source:
    expand('insert')
    expand(item)
is hard to beat for clarity and time.
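For anyone who wants to see the gap, a rough timeit comparison along
these lines (the size and repeat count are arbitrary choices of mine)
will show it:

import timeit

setup = "source = list(range(10000))"

concat = "sum((['insert', a] for a in source), [])"
append_loop = """\
expanded = []
expand = expanded.append
for item in source:
    expand('insert')
    expand(item)
"""

# The concatenating version copies a growing result list on every
# addition (O(n**2)); the append loop does constant amortized work
# per item (O(n)).
print(timeit.timeit(concat, setup, number=5))
print(timeit.timeit(append_loop, setup, number=5))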
expanded = []
expand = expanded.extend
for item in source:
    expand(['insert', item])
might be faster if creating the two-element list is faster than the
second expand call. Extend is also O(1) amortized per item here, since
each call scans only the new list of length 2. Note that a typical
lisp-like version would recursively traverse source to nil and build
expanded from tail to head by using the equivalent of

return ['insert', item] + expanded

(written with + because list.extend returns None rather than the list).
In Lisp the corresponding cons is O(1) per item because tails are
shared; a literal Python translation copies the tail on every step.
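A sketch of that lisp-like shape in Python (interleave_rec is my name
for it; Python's recursion limit and the slice copies make this
impractical for long sources):

def interleave_rec(source):
    # Base case: an empty source (lisp's nil) gives an empty result.
    if not source:
        return []
    # 'Cons' the two new items onto the recursively built tail.
    return ['insert', source[0]] + interleave_rec(source[1:])

interleave_rec([1, 2, 3])
['insert', 1, 'insert', 2, 'insert', 3]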
def interleave(source):
    for item in source:
        yield 'insert'
        yield item

list(interleave(source))
might also be faster, since it avoids the repeated Python-level method
calls. I prefer it anyway as modern, idiomatic Python, in that it
separates interleaving from creating a list. In many situations,
creating a list from the interleaved stream will not be needed.
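To illustrate, using interleave as defined above (the itertools
spelling at the end is an alternative I am adding, not something from
the original code):

from itertools import chain

source = [1, 2, 3]

# The stream can feed any consumer directly; no intermediate list is built.
' '.join(str(tok) for tok in interleave(source))
'insert 1 insert 2 insert 3'

# An equivalent spelling with itertools.chain:
list(chain.from_iterable(('insert', item) for item in source))
['insert', 1, 'insert', 2, 'insert', 3]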