I have a working generalized concat in one of my branches. It needs a bit of cleanup, but the hardest question to answer now is which API we want to offer: block, concat, or merging this into stack (i.e. if the "stacking axes" already exist, do a concat, otherwise stack/create the axes).
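The concat-vs-stack distinction can be illustrated with plain NumPy (the larray behaviour discussed here is still hypothetical; only the NumPy calls below are real):

```python
import numpy as np

a = np.arange(6).reshape(2, 3)
b = np.arange(6, 12).reshape(2, 3)

# concat: join along an *existing* axis -> shape (4, 3)
c = np.concatenate([a, b], axis=0)

# stack: create a *new* axis -> shape (2, 2, 3)
s = np.stack([a, b], axis=0)
```

A merged larray API would presumably pick between these two behaviours depending on whether the target axes already exist in the operands.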
The goal is to make it easier to produce arrays like this:
I don't know what the best syntax for this would be. Generalize stack, concat, or from_lists?
Assuming blocking along existing axes (i.e. "generalized concat"):
See
https://docs.scipy.org/doc/numpy/reference/generated/numpy.block.html#numpy.block
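For reference, np.block assembles an array from nested lists of blocks, joining the innermost lists along the last axis and outer lists along earlier axes:

```python
import numpy as np

A = np.ones((2, 2))
B = np.zeros((2, 2))

# Assemble a 4x4 array from four 2x2 blocks:
# the inner lists are joined horizontally, the outer list vertically.
M = np.block([[A, B],
              [B, A]])
```

A generalized larray concat could expose a similar nested-list interface, with axis labels replacing positional axes.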
Notes:
I do not want to make any effort to support:
like np.block does, but if it comes naturally from the implementation (which is likely, and I guess is the case for np.block), I do not think it is worth adding an explicit check to disallow it.
I don't see any real value in providing both concat and block in the API, as concat could simply be a special case of block, or vice versa. There are edge cases with non-LArray types where the two would not be equal, but I do not think they justify having two functions instead of one.
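Indeed, for a flat (non-nested) list of arrays, np.block already reduces to concatenation along the last axis, which supports folding one function into the other:

```python
import numpy as np

a = np.array([1, 2])
b = np.array([3, 4])

# A flat list passed to np.block behaves like concatenation
# along the last axis.
r_block = np.block([a, b])
r_concat = np.concatenate([a, b])
```

So a single block-style entry point could cover the plain concat use case as its degenerate form.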
It would be nifty if we could write:
But that would be a bit tricky to implement (even if theoretically possible).