How Can I Slice A Pytorch Tensor With Another Tensor?
Solution 1:
Here you go (EDIT: if your tensors live on the GPU, you probably need to copy them to CPU with tensor = tensor.cpu() before doing the following operations):
index = torch.tensor([[124, 583, 158, 529],
                      [172, 631, 206, 577]], device='cuda:0')

# create a concatenated list of the ranges of indices you want to slice
indexer = np.r_[tuple([np.s_[i:j] for (i, j) in zip(index[0, :], index[1, :])])]

# slice using numpy indexing
sliced_inp = inp[:, indexer, :]
Here is how it works:
np.s_[i:j] creates a slice object (simply a range of indices) from start=i to end=j.

np.r_[i:j, k:m] creates an array of ALL indices in the slices (i, j) and (k, m). (You can pass more slices to np.r_ to concatenate them all together at once; this is an example of concatenating only two slices.)
Therefore, indexer is an array of ALL the indices you want, built by concatenating a list of slices (each slice being one range of indices).
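Putting the pieces above together, here is a minimal runnable sketch (the input shape 4 x 1040 x 161 is taken from the question; everything runs on CPU, and the indexer is converted to a tensor before indexing):

```python
import numpy as np
import torch

# hypothetical input: batch of 4, 1040 steps, 161 features
inp = torch.randn(4, 1040, 161)
index = torch.tensor([[124, 583, 158, 529],
                      [172, 631, 206, 577]])

# one flat array of indices built from the (start, end) pairs
indexer = torch.from_numpy(
    np.r_[tuple(np.s_[i:j] for i, j in zip(index[0].tolist(), index[1].tolist()))]
)

sliced_inp = inp[:, indexer, :]
# each range contributes (end - start) indices:
# (172-124) + (631-583) + (206-158) + (577-529) = 4 * 48 = 192
print(sliced_inp.shape)  # torch.Size([4, 192, 161])
```
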
UPDATE: If you need to remove interval overlaps and sort the indices:

indexer = np.unique(indexer)

If you want to remove overlaps but keep the original order (keeping the first occurrence of each duplicate) instead of sorting:

uni = np.unique(indexer, return_index=True)[1]
indexer = [indexer[index] for index in sorted(uni)]
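The difference between the two de-duplication variants can be seen on a small hand-made index array (the values below are illustrative, not from the question):

```python
import numpy as np

# hypothetical index list with duplicates, not in sorted order
indexer = np.array([3, 4, 5, 4, 5, 6, 1, 2])

# variant 1: deduplicate AND sort
sorted_unique = np.unique(indexer)
print(sorted_unique)  # [1 2 3 4 5 6]

# variant 2: deduplicate but keep original order of first occurrences
first_pos = np.unique(indexer, return_index=True)[1]
deduped = [int(indexer[i]) for i in sorted(first_pos)]
print(deduped)  # [3, 4, 5, 6, 1, 2]
```
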
Solution 2:
inp = torch.randn(4, 1040, 161)
indices = torch.tensor([[124, 583, 158, 529],
                        [172, 631, 206, 577]])

k = zip(indices[0], indices[1])
for i, j in k:
    print(inp[:, i:j, :])
You can implement it like this. The zip function converts your indices tensor into (start, end) pairs that you can use directly in a for loop.

Hope it helps you out.
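If you need the loop's slices combined into a single tensor rather than printed one by one, a small variation is to collect them and concatenate along the sliced dimension (this works here because concatenation only requires the other dimensions to match; the shape below follows from the ranges in the question):

```python
import torch

inp = torch.randn(4, 1040, 161)
indices = torch.tensor([[124, 583, 158, 529],
                        [172, 631, 206, 577]])

# gather each (start, end) slice, then join them along dim 1
pieces = [inp[:, i:j, :] for i, j in zip(indices[0], indices[1])]
combined = torch.cat(pieces, dim=1)
print(combined.shape)  # torch.Size([4, 192, 161])
```
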