
Salesforce Exam MuleSoft Integration Architect I Topic 6 Question 26 Discussion

Actual exam question for Salesforce's MuleSoft Integration Architect I exam
Question #: 26
Topic #: 6

An organization is designing an integration solution to replicate financial transaction data from a legacy system into a data warehouse (DWH).

The DWH must contain a daily snapshot of financial transactions, to be delivered as a CSV file. Daily transaction volume exceeds tens of millions of records, with significant spikes in volume during popular shopping periods.

What is the most appropriate integration style for an integration solution that meets the organization's current requirements?

A) Event-driven architecture
B) Microservice architecture
C) API-led connectivity
D) Batch-triggered ETL

Suggested Answer: A
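
The discussion below leans heavily toward batch-triggered ETL for this scenario: the requirement is a once-a-day CSV snapshot of tens of millions of rows, not real-time propagation. As a rough illustration of that style (not an official solution, and in a real MuleSoft design this would typically be a scheduled Mule batch job rather than a script), here is a minimal Python sketch; the database path, the financial_transactions table, and its column names are hypothetical and invented for the example:

# Minimal sketch of a batch-triggered ETL job (illustrative only).
# Assumes a hypothetical DB-API source with a "financial_transactions"
# table keyed by transaction_date; names are not from the exam question.
import csv
import sqlite3
from datetime import date, timedelta

PAGE_SIZE = 50_000  # read in pages so tens of millions of rows never sit in memory at once

def export_daily_snapshot(db_path: str, out_dir: str) -> str:
    snapshot_day = date.today() - timedelta(days=1)  # yesterday's transactions
    out_path = f"{out_dir}/transactions_{snapshot_day.isoformat()}.csv"

    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()
    cursor.execute(
        "SELECT id, account_id, amount, currency, transaction_date "
        "FROM financial_transactions WHERE transaction_date = ? ORDER BY id",
        (snapshot_day.isoformat(),),
    )

    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["id", "account_id", "amount", "currency", "transaction_date"])
        while True:
            rows = cursor.fetchmany(PAGE_SIZE)  # stream the result set page by page
            if not rows:
                break
            writer.writerows(rows)

    conn.close()
    return out_path

if __name__ == "__main__":
    # A scheduler (cron, or a Mule scheduler triggering a batch flow) would run this once per day.
    print(export_daily_snapshot("legacy.db", "/tmp"))

The point of the sketch is the trigger-and-stream pattern: the job runs on a daily schedule, pages through the source rather than loading everything into memory, and produces the snapshot file in one pass.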

Contribute your Thoughts:

Erinn
1 month ago
API-led connectivity? More like API-led confusion if you ask me. Batch ETL all the way, folks. It's the classic choice for a reason.
upvoted 0 times
Gwen
18 days ago
Event-driven architecture might be too complex for this scenario; batch ETL seems like the safer bet.
upvoted 0 times
Louvenia
24 days ago
I agree, it's a reliable and efficient choice for this kind of integration.
upvoted 0 times
Clare
1 month ago
Batch ETL is definitely the way to go for handling large volumes of data.
upvoted 0 times
Deeanna
2 months ago
I'm not sure why anyone would even consider microservices for this use case. That's like trying to build a skyscraper with toothpicks!
upvoted 0 times
Tommy
15 days ago
D) Batch-triggered ETL
upvoted 0 times
Reynalda
18 days ago
C) API-led connectivity
upvoted 0 times
Miles
22 days ago
A) Event-driven architecture
upvoted 0 times
Irma
2 months ago
Hmm, I don't think an event-driven architecture would work well here. The high transaction volume and need for a daily snapshot make a batch-triggered ETL the logical choice in my opinion.
upvoted 0 times
Cordelia
2 months ago
I'm leaning towards option D. With millions of records and spikes in volume, a batch-triggered ETL process seems like the most efficient way to handle the data load without risking performance issues.
upvoted 0 times
Mitsue
24 days ago
I agree with you. Option D seems like the best choice for handling such a large volume of data efficiently.
upvoted 0 times
Kallie
27 days ago
D) Batch-triggered ETL
upvoted 0 times
Matthew
1 month ago
C) API-led connectivity
upvoted 0 times
Pamella
1 month ago
B) Microservice architecture
upvoted 0 times
Nathalie
1 month ago
A) Event-driven architecture
upvoted 0 times
Malcom
2 months ago
That's a good point, but wouldn't Microservice architecture also be able to handle the daily snapshot requirement efficiently?
upvoted 0 times
Vincent
2 months ago
I disagree; I believe Event-driven architecture would be better for handling spikes in volume.
upvoted 0 times
Malcom
2 months ago
I think the most appropriate integration style would be Batch-triggered ETL.
upvoted 0 times
