Flashcards for topic Methods Common to All Objects
Why is it impossible to extend an instantiable class, add a value component, and preserve the equals contract simultaneously?
This is a fundamental problem of equivalence relations in object-oriented languages.
Adding a value component creates a dilemma: the subclass equals method must either ignore the new component when comparing against superclass instances or take it into account, and both choices break the contract.
Attempted solutions all fail:
- Comparing the new component only when the argument is also a subclass instance violates symmetry: super.equals(sub) can be true while sub.equals(super) is false.
- Ignoring the new component in mixed-type comparisons restores symmetry but violates transitivity: two subclass instances can each equal the same superclass instance yet differ from each other.
- Replacing the instanceof test with a getClass test preserves the equivalence relation but violates the Liskov substitution principle: subclass instances can no longer be used where superclass instances are expected.
Recommended workaround: Use composition instead of inheritance - give the "extended" class a private field of the base class type and a view method to expose it when needed.
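A minimal sketch of this composition-plus-view-method approach, using the classic Point/ColorPoint example (class and field names are illustrative):

```java
// Point is a simple value class with its own equals/hashCode.
class Point {
    private final int x, y;
    Point(int x, int y) { this.x = x; this.y = y; }
    @Override public boolean equals(Object o) {
        if (!(o instanceof Point)) return false;
        Point p = (Point) o;
        return p.x == x && p.y == y;
    }
    @Override public int hashCode() { return 31 * x + y; }
}

// Instead of extending Point, ColorPoint contains one.
class ColorPoint {
    private final Point point;
    private final String color;
    ColorPoint(int x, int y, String color) {
        this.point = new Point(x, y);
        this.color = color;
    }
    // View method: exposes the contained Point when one is needed.
    public Point asPoint() { return point; }
    @Override public boolean equals(Object o) {
        if (!(o instanceof ColorPoint)) return false;
        ColorPoint cp = (ColorPoint) o;
        return cp.point.equals(point) && cp.color.equals(color);
    }
    @Override public int hashCode() {
        return 31 * point.hashCode() + color.hashCode();
    }
}
```

Because ColorPoint is not a Point, ColorPoint's equals only ever compares ColorPoints, so the equivalence relation holds; callers who need Point semantics go through asPoint().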
When should you NOT override the equals method in a class? List the specific conditions.
You should NOT override the equals method when any of these conditions apply:
Each instance is inherently unique - Such as with active entities (e.g., Thread) rather than value objects. The default Object.equals implementation is appropriate.
No need for "logical equality" test - When clients don't need to compare instances for logical equivalence (e.g., java.util.regex.Pattern doesn't need to compare if two patterns represent the same regex).
Superclass has already overridden equals appropriately - If a parent class already implements equals in a way that works for your class (e.g., most Set implementations inherit equals from AbstractSet).
Class is private or package-private and equals will never be invoked - When you're certain the method won't be called, though you might defensively override it to throw AssertionError.
Class uses instance control (Item 1) - If at most one object exists with each value (like Enum types), logical equality is the same as object identity.
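For the private or package-private case above, the defensive override can be sketched as follows (the class name is hypothetical):

```java
// Package-private helper whose equals should never be invoked;
// the override fails fast if that assumption is ever wrong.
class CacheKeyHelper {
    @Override public boolean equals(Object o) {
        throw new AssertionError(); // Method is never called
    }
}
```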
What problem occurs when you override equals() but don't override hashCode(), and how does this affect collections like HashMap?
When equals() is overridden without hashCode(), equal objects can have different hash codes, violating the hashCode contract that states "equal objects must have equal hash codes."
Consequences in collections: hash-based collections (HashMap, HashSet, Hashtable) malfunction, because an entry stored under one instance cannot be retrieved with an equal instance whose hash code differs.
Example:
// Without proper hashCode()
Map<PhoneNumber, String> m = new HashMap<>();
m.put(new PhoneNumber(707, 867, 5309), "Jenny");
m.get(new PhoneNumber(707, 867, 5309)); // Returns null!
The get() method looks in a different hash bucket than where put() stored the entry.
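A hedged sketch of the fix: overriding hashCode alongside equals, here using java.util.Objects.hash for brevity (convenient, though somewhat slower than a hand-rolled version because it boxes primitives). PhoneNumber's fields are assumed to be the three values shown:

```java
import java.util.Objects;

class PhoneNumber {
    private final short areaCode, prefix, lineNum;
    PhoneNumber(int areaCode, int prefix, int lineNum) {
        this.areaCode = (short) areaCode;
        this.prefix = (short) prefix;
        this.lineNum = (short) lineNum;
    }
    @Override public boolean equals(Object o) {
        if (!(o instanceof PhoneNumber)) return false;
        PhoneNumber pn = (PhoneNumber) o;
        return pn.areaCode == areaCode && pn.prefix == prefix
                && pn.lineNum == lineNum;
    }
    // Equal objects now produce equal hash codes, so HashMap lookups work.
    @Override public int hashCode() {
        return Objects.hash(areaCode, prefix, lineNum);
    }
}
```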
Provide a complete recipe for implementing a high-quality hashCode() method.
Recipe for a high-quality hashCode() implementation:
1. Declare an int variable result and initialize it to the hash code of the first significant field.
2. For each remaining significant field f:
   a. Compute an int hash code c for the field:
      - Primitive type: Type.hashCode(f) (e.g., Integer.hashCode(f))
      - Object reference: f.hashCode() (use 0 if the field is null)
      - Array: Arrays.hashCode(f)
   b. Combine c into result: result = 31 * result + c;
3. Return result.
Example for PhoneNumber:
@Override public int hashCode() {
    int result = Short.hashCode(areaCode);
    result = 31 * result + Short.hashCode(prefix);
    result = 31 * result + Short.hashCode(lineNum);
    return result;
}
Note: Only include fields used in equals() comparisons. Derived fields can be excluded if they're calculated from included fields.
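When instances are immutable and hashing is expensive, the computed value can be cached; a sketch with lazy initialization (field layout is illustrative, mirroring the PhoneNumber example):

```java
class PhoneNumber {
    private final short areaCode, prefix, lineNum;
    private int hashCode; // Cached; 0 means "not yet computed"

    PhoneNumber(int areaCode, int prefix, int lineNum) {
        this.areaCode = (short) areaCode;
        this.prefix = (short) prefix;
        this.lineNum = (short) lineNum;
    }

    @Override public int hashCode() {
        int result = hashCode;
        if (result == 0) {            // Compute on first use only
            result = Short.hashCode(areaCode);
            result = 31 * result + Short.hashCode(prefix);
            result = 31 * result + Short.hashCode(lineNum);
            hashCode = result;
        }
        return result;
    }
}
```

One caveat of the 0-as-sentinel choice: an instance whose hash happens to be 0 is recomputed on every call, which is harmless but wasteful.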
Why is it impossible to properly extend a class that implements Comparable with a new value component while preserving the compareTo contract?
Extending a Comparable class with a new value component breaks the contract because:
Violation of the symmetry rule: if a subclass instance S is compared to a superclass instance P, S.compareTo(P) can take the new component into account while P.compareTo(S) cannot, so sgn(S.compareTo(P)) need not equal -sgn(P.compareTo(S)).
Violation of transitivity: consider objects A, B, C where A and C are subclass instances that agree on all superclass fields but differ in the new component, and B is a superclass instance. Then A.compareTo(B) == 0 and B.compareTo(C) == 0, yet A.compareTo(C) != 0.
Solution: Instead of extending the class, create a new class containing an instance of the original class and provide a "view" method to retrieve it when needed.
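A brief sketch of that workaround for Comparable (class names are hypothetical): the new class contains an instance of the original and exposes it through a view method, rather than inheriting compareTo.

```java
class Version implements Comparable<Version> {
    private final int number;
    Version(int number) { this.number = number; }
    @Override public int compareTo(Version v) {
        return Integer.compare(number, v.number);
    }
}

// Adds a value component without extending Version.
class NamedVersion {
    private final Version version;
    private final String name;
    NamedVersion(int number, String name) {
        this.version = new Version(number);
        this.name = name;
    }
    // View method: callers needing a Version (e.g., for sorting) use this.
    public Version asVersion() { return version; }
}
```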
What are the specific reasons why immutable classes should generally avoid providing a clone() method?
Immutable classes should avoid providing clone() methods because:
- A copy of an immutable object is indistinguishable from the original, so cloning provides no benefit.
- Providing clone() would merely encourage wasteful copying.
For immutable objects, alternatives to cloning include simply sharing the original instance and providing static factories that can return cached instances.
Example of what NOT to do:
// Unnecessary and incorrect for immutable classes:
@Override public PhoneNumber clone() {
    try {
        return (PhoneNumber) super.clone();
    } catch (CloneNotSupportedException e) {
        throw new AssertionError();
    }
}
How should you implement a non-recursive deepCopy() method for a linked list to avoid stack overflow when cloning a deeply nested structure, such as the bucket chains in a hash table?
// Iteratively copy the linked list headed by this Entry
Entry deepCopy() {
    // Create the first node
    Entry result = new Entry(key, value, null);
    // Use iteration instead of recursion to avoid stack overflow
    Entry current = result;
    Entry original = this.next;
    // Walk the original list, creating new nodes
    while (original != null) {
        current.next = new Entry(original.key, original.value, null);
        current = current.next;
        original = original.next;
    }
    return result;
}
Key advantages: stack depth stays constant regardless of list length, so even very long bucket chains cannot overflow the stack, and each node is still copied exactly once.
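In context, deepCopy would typically be invoked from the enclosing hash table's clone method, once per non-empty bucket; a hedged sketch (the HashTable structure shown here is assumed):

```java
class HashTable implements Cloneable {
    Entry[] buckets = new Entry[16];

    static class Entry {
        final Object key;
        Object value;
        Entry next;
        Entry(Object key, Object value, Entry next) {
            this.key = key; this.value = value; this.next = next;
        }
        // Iterative copy of the chain headed by this entry
        Entry deepCopy() {
            Entry result = new Entry(key, value, null);
            for (Entry p = result, q = next; q != null; q = q.next) {
                p.next = new Entry(q.key, q.value, null);
                p = p.next;
            }
            return result;
        }
    }

    @Override public HashTable clone() {
        try {
            HashTable result = (HashTable) super.clone();
            result.buckets = new Entry[buckets.length];
            for (int i = 0; i < buckets.length; i++)
                if (buckets[i] != null)
                    result.buckets[i] = buckets[i].deepCopy(); // Independent chains
            return result;
        } catch (CloneNotSupportedException e) {
            throw new AssertionError();
        }
    }
}
```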
What are the alternative approaches to object copying that are preferable to implementing the Cloneable interface, and what specific advantages do they offer?
Better alternatives to Cloneable:
Copy Constructor:
public Yum(Yum original) {
    this.field1 = original.field1;
    this.field2 = new ArrayList<>(original.field2); // Deep copy if needed
    // Copy remaining fields
}
Static Copy Factory:
public static Yum newInstance(Yum original) {
    Yum copy = new Yum();
    copy.field1 = original.field1;
    copy.field2 = new ArrayList<>(original.field2);
    return copy;
}
Advantages over Cloneable/clone: they don't rely on a risk-prone extralinguistic object-creation mechanism; they don't conflict with the proper use of final fields; they throw no unnecessary checked exceptions; they require no casts; and they can take an interface implemented by the class as a parameter (conversion constructors/factories, e.g., new TreeSet<>(s)).
Exception: Arrays are best copied with clone() method.
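The array exception works because clone() on an array returns a new array of the same runtime type, with no cast required; a tiny sketch (the helper name is hypothetical):

```java
class ArrayCopyDemo {
    static int[] copyOf(int[] original) {
        // clone() on arrays is covariant: it returns int[], not Object
        return original.clone();
    }
}
```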
For thread safety, what critical step must be taken when implementing the clone() method in a thread-safe class that implements Cloneable?
For thread safety in a clone() method:
// Thread-safe clone implementation
@Override public synchronized MyThreadSafeClass clone() {
    try {
        MyThreadSafeClass result = (MyThreadSafeClass) super.clone();
        // Perform any necessary deep copying here
        return result;
    } catch (CloneNotSupportedException e) {
        throw new AssertionError();
    }
}
Critical requirements:
- Add the synchronized keyword to the clone method.
- Even if the super.clone() implementation is satisfactory in other respects, it is not synchronized.
- Like any other method of a thread-safe class, clone() must be properly synchronized; the clone() method is no exception to this rule.
This ensures thread safety without risking race conditions during the cloning process.
What pattern must be followed when implementing a multi-field comparison with fields of different significance levels?
The critical lexicographic ordering pattern: compare the most significant field first; if the result is non-zero, return it immediately; otherwise move on to the next most significant field, continuing until a non-zero result is found or the least significant field has been compared.
Example code:
public int compareTo(ComplexObject obj) {
    // Compare primary field (most significant)
    int result = primaryField.compareTo(obj.primaryField);
    if (result != 0) return result;
    // Compare secondary field
    result = Integer.compare(secondaryField, obj.secondaryField);
    if (result != 0) return result;
    // Compare tertiary field
    result = tertiaryField.compareTo(obj.tertiaryField);
    return result; // Will be 0 if all fields are equal
}
This pattern ensures:
• Minimal field comparisons (early-exit optimization)
• Proper lexicographic ordering
• Transitive comparison results
• Stable sorting behavior
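The same lexicographic ordering can also be built declaratively with Java 8 comparator construction methods (Comparator.comparingInt / thenComparingInt), at a modest performance cost; the field names here mirror the earlier PhoneNumber example:

```java
import java.util.Comparator;

class PhoneNumber implements Comparable<PhoneNumber> {
    final short areaCode, prefix, lineNum;
    PhoneNumber(int areaCode, int prefix, int lineNum) {
        this.areaCode = (short) areaCode;
        this.prefix = (short) prefix;
        this.lineNum = (short) lineNum;
    }

    // Most significant field first, then the tie-breakers, in order
    private static final Comparator<PhoneNumber> COMPARATOR =
            Comparator.comparingInt((PhoneNumber pn) -> pn.areaCode)
                      .thenComparingInt(pn -> pn.prefix)
                      .thenComparingInt(pn -> pn.lineNum);

    @Override public int compareTo(PhoneNumber pn) {
        return COMPARATOR.compare(this, pn);
    }
}
```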