Closed
Description
Issue type
Bug
Have you reproduced the bug with TensorFlow Nightly?
Yes
Source
source
TensorFlow version
nightly 20250225
Custom code
Yes
OS platform and distribution
No response
Mobile device
No response
Python version
No response
Bazel version
No response
GCC/compiler version
No response
CUDA/cuDNN version
No response
GPU model and memory
No response
Current behavior?
Eager execution succeeds, but the same `call` compiled with `@tf.function(jit_compile=True)` fails during tf2xla conversion (log below). Expected behavior: the XLA compilation should pass as well.
Standalone code to reproduce the issue

```python
import os

# Hide GPUs before importing TensorFlow so the env var takes effect.
os.environ["CUDA_VISIBLE_DEVICES"] = "-1"

import numpy as np
import tensorflow as tf


class ComplexModel(tf.keras.Model):
    def __init__(self):
        super().__init__()

    def call(self, x):
        x_sparse = tf.sparse.from_dense(x)
        x = tf.sparse.minimum(x_sparse, tf.sparse.from_dense(tf.ones_like(x)))
        return x


model = ComplexModel()
x = tf.constant([[1.0, 2.0], [3.0, 4.0]], dtype=tf.float32)
inputs = [x]
model(*inputs)
print("succeed on eager")


class ComplexModel(tf.keras.Model):
    def __init__(self):
        super().__init__()

    @tf.function(jit_compile=True)
    def call(self, x):
        x_sparse = tf.sparse.from_dense(x)
        x = tf.sparse.minimum(x_sparse, tf.sparse.from_dense(tf.ones_like(x)))
        return x


model = ComplexModel()
model(*inputs)
print("succeed on XLA")
```

Relevant log output

```
succeed on eager
tf2xla conversion failed while converting __inference_call_33[_XlaMustCompile=true,config_proto=13561319589895757934,executor_type=11160318154034397263]. Run with TF_DUMP_GRAPH_PREFIX=/path/to/dump/dir and --vmodule=xla_compiler=2 to obtain a dump of the compiled functions. [Op:__inference_call_33]
```
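For context (not part of the original report): `SparseTensor`-producing ops are generally not lowered by tf2xla, which is consistent with the conversion failure above. A minimal sketch of a possible workaround, assuming the input is fully dense anyway, is to keep the computation dense inside the XLA-compiled function; `tf.minimum` against `tf.ones_like(x)` is elementwise-equivalent to the sparse version for dense inputs and does compile with `jit_compile=True`:

```python
import tensorflow as tf


class DenseMinModel(tf.keras.Model):
    """Hypothetical dense workaround: avoid SparseTensor ops inside
    the XLA-compiled call, since tf2xla does not support them."""

    @tf.function(jit_compile=True)
    def call(self, x):
        # Elementwise minimum against a tensor of ones; equivalent to
        # tf.sparse.minimum on sparse views of the same dense tensors.
        return tf.minimum(x, tf.ones_like(x))


model = DenseMinModel()
x = tf.constant([[1.0, 2.0], [3.0, 4.0]], dtype=tf.float32)
out = model(x)
print(out.numpy())  # every element clamped to at most 1.0
```

Whether this is acceptable depends on the real use case; if the sparsity itself matters, the sparse ops have to stay outside the `jit_compile=True` boundary.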